All the Right Moves: Learning With Gestural Mobile Devices
Ever notice that people talk with their hands?
So have the creators of smartphones, iPads and other latest-generation tools that are rapidly consigning the keyboard-and-mouse, point-and-click era to the dustbin of technological history.
These changes reflect a growing body of research showing that gestures are not random, but in fact correspond to ideas being expressed, said John Black, TC’s Cleveland A. Dodge Professor of Telecommunications and Education, in introducing a panel of students and alumni who are working at the cutting edge of learning technology.
Ayelet Segal (Ph.D. ’11) told listeners that in her dissertation research, she found that children who played certain mathematics learning games on iPads performed better and were more engaged than when using a mouse version. But the gestures the app requires have to be the right ones, she said. “If the child counts individual blocks, the gesture has to be discrete. But for estimating a number on a continuous line, a continuous gesture” – such as swiping one’s finger along a line – “supports performance better.” Segal has applied her research in an app, titled MathGlow, released by the company she founded, iGeneration.
Michael Swart, a TC Doctoral Research Fellow, showed clips of researchers talking to children about concepts used in teaching fractions: What is the whole? What are the parts? Are some parts bigger or smaller than others? “Watch the gestures the child makes,” Swart said. Different gestures, such as grasping, pointing, or making a gathering motion, apply to different steps on the way to learning fractions and putting them to use. “We’ve found out what gestures kids use when talking about fractions,” Swart said. Now his team is developing a game, illustrated by cartoon characters from the educational television program Cyberchase, to put these results into effect.

But every interface has its limits. Inputting information remains awkward, as anyone who has tried to write a long message on an iPad knows, said Nabeel Ahmad (Ed.D. ’09), a TC adjunct professor and a Learning Developer at IBM. “The new big thing is voice,” Ahmad said, pointing to Apple’s “Siri” tool as an illustration. Segal said researchers are now studying the cognitive embodiment of freeform, three-dimensional gestures. Touchscreen devices have quickly opened new horizons for learning and made new tools possible, but the next revolution might be coming even faster.
Published Tuesday, Apr. 23, 2013