Yesterday I visited Durham’s Technology-Enhanced Learning Research group, who had invited me to see their multi-touch interactive desk that hit the headlines recently. It was in fact my Mum who sent me a newspaper clipping about the device, and following my TeachMeet presentation Dr Liz Burd invited me to visit.

It is always thrilling to see the birthplace of new technologies, and I have been privileged to see two such concept labs. It is motivating and inspiring to meet innovative people such as the team working at Durham University, and equally exciting to hear their open philosophy towards the interactive desk’s development.

Andrew Hatch and his pride and joy

The interactive desk is one element of a much bigger picture: research and development looking to redefine what a collaborative learning environment can be. Durham has won four years of funding for the project. It is very refreshing that this is not just about a single device or product but deals with, as the team state in their grant proposal, “the design of an educational technology that is strongly supportive of social pedagogy.” They call the wider learning environment concept “SynergyNet”.

This learning environment will be technology rich, where ICT is seamlessly integrated into the fabric of a classroom but the technology does not intrude on the main focus of the activity (Smith and Harrison 2001). Our enthusiastic claims for the positive impact of this technology on learning are based on its ability to facilitate classroom dialogue and pupil collaboration. Central to SynergyNet is a new form of desk that contains a large built-in multi-touch surface.

The team go on to explain that:

This research aligns directly to the TLRP’s (Teaching and Learning Research Programme) evidence-informed pedagogic Principle 7 (James and Pollard, 2007): that effective teaching and learning foster not only individual but also social processes and outcomes. Thus this research aims:

  • Aim 1: To create a radically new technology-rich learning environment that integrates with traditional classroom layouts and collective activities.
  • Aim 2: To design and implement a new form of user interface for educational multi-touch systems.
  • Aim 3: To formulate a new pedagogy that eases transition and movement between teacher-centric and pupil-centric interaction.
  • Aim 4: To analyse pupils’ learning strategies to inform fundamental research by capturing data as pupils use the SynergyNet environment.

Doctors Liz Burd, Andrew Hatch (seated in the picture above) and Phyo Kyaw were my hosts for the day and showed me their Techno Cafe, which was the inspiration for the SynergyNet project. It was an informal learning space for lectures and seminars, divided up into small booths in the style of a diner. Each booth was rich with technology: a SMARTBoard, hard wiring for tablets and other devices, speakers, and cameras to monitor the activity in the booths from a central teacher’s podium.

The multi-touch desk itself has been developed with learning in mind from the beginning, and actually using it was very exciting. However, the design was unexpected: it is a large podium with the surface itself at about a 40-degree angle. The surface is a synthetic fabric like a drum skin, and Liz Burd explained that they had tried all sorts of different surfaces to facilitate touch and she thought a tracing-paper texture would be ideal.

As I have said, the open approach to the project was a refreshing change, and the team openly encouraged me to take pictures and video and to blog about the project. Here are three videos I took, currently available on YouTube – please use them in your own blog posts and show them to staff to instigate discussion.

Both of these simple applications exist very much to prove the code and application architecture that underpins the use of the device. You can see that they are simple and rudimentary, but this is the first step. We talked about the possibilities for applications, and I was delighted to hear that Liz, Andrew and Phyo would be willing to take ideas and contributions from educators working with a range of different age groups.

This video shows the inner workings of the device – you can see the projector/camera/infrared construction. Again, their willingness for me to film “behind the scenes” underlines the open-source philosophy behind many of the project’s elements.
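
A rear projector plus an infrared camera is the standard recipe for camera-based multi-touch surfaces: fingers touching the surface show up as bright blobs in the IR image, and software turns those blobs into touch points. As a rough, hypothetical illustration of that principle – not Durham’s actual code, and with function names and values of my own – here is a minimal pure-Python blob detector that thresholds a greyscale frame and flood-fills connected bright regions into touch centroids:

```python
from collections import deque

def find_touch_points(frame, threshold=200, min_area=4):
    """Locate centroids of bright blobs (fingertips) in a greyscale
    infrared frame, given as a 2-D list of 0-255 pixel values.
    Pixels above `threshold` are grouped into 4-connected components
    with a breadth-first flood fill; small components are noise."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    points = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] < threshold or seen[y][x]:
                continue
            # Flood-fill one blob, collecting its pixel coordinates
            queue, blob = deque([(y, x)]), []
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                blob.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and not seen[ny][nx]
                            and frame[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(blob) >= min_area:  # ignore single-pixel noise
                mean_x = sum(p[1] for p in blob) / len(blob)
                mean_y = sum(p[0] for p in blob) / len(blob)
                points.append((mean_x, mean_y))
    return points

# A tiny synthetic frame with two bright 2x2 "fingertips"
frame = [[0] * 10 for _ in range(10)]
for y, x in [(2, 2), (2, 3), (3, 2), (3, 3),
             (6, 6), (6, 7), (7, 6), (7, 7)]:
    frame[y][x] = 255
print(sorted(find_touch_points(frame)))  # → [(2.5, 2.5), (6.5, 6.5)]
```

A real system would of course read frames from the IR camera and track blobs between frames, but the thresholding-and-grouping step is the heart of it.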

In fact, the project’s stated research outcomes make these ideas explicit:

  • A revolutionary learning environment using integrated ICT – We will develop free, open-source software to enable pupils to use the SynergyNet multi-touch tables and teachers to control the immersive classroom environment.
  • A new integrated pedagogy – Through the use and the design of the SynergyNet environment, we will evolve a new technology-supported social pedagogy.
  • A data capture system – We will develop free, open-source software to enable researchers automatically to capture video and audio data and simultaneously record user-interactions with PC or multi-touch technology.
  • A data-rich repository of classroom activity – We will record pupils’ collaborative exchanges (verbal and non-verbal) as they use the system. This will be used to inform the evaluation and evolution of the research, but the richness of the data means that it has great potential to support other research projects within TLRP and beyond.
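
The data-capture outcome above is described only at a high level, so purely as a speculative sketch – not the project’s actual software, with a class name and record fields of my own invention – here is one way timestamped touch interactions could be logged as JSON lines, ready to replay later alongside the video and audio record:

```python
import io
import json
import time

class TouchEventLogger:
    """Append timestamped touch events as one JSON object per line,
    so researchers can later line interactions up against video/audio.
    A `clock` callable is injected to keep the output testable."""

    def __init__(self, stream, clock=time.time):
        self.stream = stream
        self.clock = clock

    def log(self, desk_id, event_type, x, y):
        # x and y are normalised surface coordinates in [0, 1]
        record = {"t": self.clock(), "desk": desk_id,
                  "event": event_type, "x": x, "y": y}
        self.stream.write(json.dumps(record) + "\n")

# Usage with an in-memory stream and a fixed clock
buf = io.StringIO()
logger = TouchEventLogger(buf, clock=lambda: 0.0)
logger.log("desk-1", "touch_down", 0.25, 0.75)
logger.log("desk-1", "touch_up", 0.25, 0.75)
print(len(buf.getvalue().splitlines()))  # → 2
```

One line per event keeps the log trivially appendable and easy to filter by desk or event type later.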

Our final conversation of the day centred on the next steps for the SynergyNet project and I raised the huge potential that social networking tools have in terms of gathering ideas and insight from wider education communities. I have agreed to help the team from Durham to facilitate the way our networks can make an impact on this research and how the voices of many teachers and educators could contribute to their project aims.

You have the opportunity to contribute to the ongoing development of this exciting project, as at this stage the team need ideas. Contribute your ideas and thoughts for development using the newly created Flickr group “Multi-Touch Interactive Desk: Applications and Gesture Ideas”.

You can contribute in two ways.

  1. Suggest possible learning activities, from any age range, that would benefit from multi-touch capabilities. Upload screenshots or photos of classroom activities that could be transformed with multi-touch. Be sure to explain what you are adding and your ideas for how it could be improved.
  2. Suggest gestures that could be developed for the device – think of the iPhone’s “pinch” and “twist”, but what else would you like? Upload a diagram, or better still a short video, of the gesture and what it would do.
     
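
To make the gesture idea concrete: a two-finger gesture like “pinch” or “twist” can be recognised purely from how the distance and angle between two touch points change over a move. The sketch below is a hypothetical illustration, not part of SynergyNet – the function name and tolerance values are my own:

```python
import math

def classify_two_finger_gesture(start, end, scale_tol=0.1, angle_tol=0.2):
    """Classify a two-finger move as 'pinch', 'spread', 'twist' or 'none'
    from the start/end positions of both touch points ((x, y) tuples).
    A shrinking finger distance is a pinch, a growing one a spread,
    and a rotation of the line between the fingers a twist."""
    (a0, b0), (a1, b1) = start, end
    d0 = math.dist(a0, b0)          # finger separation before the move
    d1 = math.dist(a1, b1)          # finger separation after the move
    ang0 = math.atan2(b0[1] - a0[1], b0[0] - a0[0])
    ang1 = math.atan2(b1[1] - a1[1], b1[0] - a1[0])
    scale = d1 / d0                 # assumes fingers start apart (d0 > 0)
    rotation = ang1 - ang0          # radians
    if scale < 1 - scale_tol:
        return "pinch"
    if scale > 1 + scale_tol:
        return "spread"
    if abs(rotation) > angle_tol:
        return "twist"
    return "none"

# Fingers at (0,0) and (10,0) moving towards each other
print(classify_two_finger_gesture(((0, 0), (10, 0)),
                                  ((2, 0), (8, 0))))   # → pinch
# One finger sweeping around the other with little distance change
print(classify_two_finger_gesture(((0, 0), (10, 0)),
                                  ((0, 0), (9, 4))))   # → twist
```

A real recogniser would run this incrementally over a stream of touch frames rather than just the endpoints, but the distance/angle comparison is the core idea behind both gestures.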

The potential for this project is huge, and if it continues to listen to the voices of wider communities it should have a strong and exciting future. No doubt we will explore the prospects of other online tools to gather your ideas, but for now take a look again at the films and think: what could you have done differently in your classroom with that sort of tool? What activities could you imagine with many of these desks working together in a classroom? Why not show your students the films and encourage them to suggest their own ideas?

I know that the team would love to hear your comments, reactions and learning activity ideas – whether here, on the YouTube videos or with an image or video contribution to the Flickr group. I think this is a wonderful opportunity for us to help define the future of classroom interactive devices and not just be the consumer, so please get the word out and let’s see if we can make a difference.

You never know – maybe in years to come you will have these devices at your school, and you could say you played your part.

13 comments

  1. Great stuff! This new technology seems to be arising from the “realm of possibilities” all over the world. I love the water simulation; it works just like the Koi Pond application for the iPhone. I believe it is still one of the most popular iPhone apps available.

    By the way, 2nd video no longer available on YouTube.

    Cheers!

  2. It’s Andy here – the guy who looks decidedly grumpy in the photograph – I guess the shutter released just before I realised what was going on and therefore before a smile was able to creep across my face!

    Just wanted to (a) extend our thanks to Tom for visiting us, and for really engaging with what we’re doing, and in turn, giving us great opportunities to widen our net out further. And (b) mention that we have integrated the ODE physics engine (www.ode.org) via jmephysics.

    Tom will also be pleased to hear that I fixed a 3D issue over the weekend that had spoilt an otherwise great game of real-physics dominos!!

  3. Really exciting!

    I’d like to see it work with music sequencing or notation software (cubase, logic, sibelius) to allow for group composition. It would work really well with Ableton Live for remixing existing tracks.

  4. That is one amazing piece of equipment!
    The water application looks like it might not need too much modification to turn it into a really cool ripple tank for use in Physics, avoiding usual puddles on floor etc.
    I’m thinking that phenomena like interference and diffraction could be identified much more clearly due to the display brightness and the potential to annotate what is going on.
    Funny (Phunny?) how 3 out of the first 4 comments want Physics as a feature!
    As an aside, I wish my dentist had one of those in his waiting room }:8(

  5. Simulated archaeology with collaboration – anything involving collaborative reveal functions. Being able to find, search and explore augmented reality objects.

  6. Very, very impressive. All the more so as the software’s open source 🙂
    Curious to see how easily text input would work with something like this, but given the advances apple have made with the iphone, this shouldn’t be too much of a stretch.
    Physics stuff and 3D would be nice, as Joe comments above.
    Dynamic geometry (like geogebra et al) and data exploration (like inspiredata) would work well, I think.
    Could imagine video storyboarding and photo editing working well.
    As a desktop interface, 3D is going to be important – sifting through piles of stuff (in effect) as well as moving things around the surface – some sort of flicking through / turning things over interface, perhaps?

  7. It’s fantastically exciting stuff, I can’t wait to have a play with one!

    I’d like all of the above plus;

    Tick and cross gestures

    a multi-touch Phun (http://www.phunland.com/wiki/Home) or any decent physics engine

    A lockout to briefly “turn off” the tables.

    Ability to use office programs a-la smartboard

    ability to post stuff to the main whiteboard and back

    a help gesture would be neat

    3d object manipulation linked in with atom models, google earth etc

    a set of 10 of them

    moon on a stick

    TA!

  8. The first use that comes to my mind which I’m sure the developers have already thought of is for collaborative group work. I’d like 4 students to be able to brainstorm a project together by writing straight on the desk – take a separate corner of the desk each to work on a different section of the brainstorm, maybe by writing something up and integrating it with pictures and multimedia and then be able to drag it all back into the middle to share together. I guess in a nutshell I’d like seamless integration of handwriting on the table with multimedia and then collaborative features to bring it all together with different students!
