How Smart Can We Get?

[Image] Making Learning Stick: John Black and Saadia Khan are exploring new dimensions in embodied cognition, which emphasizes learning through sensory experience.

[Image] Arguing for Metacognition: Kuhn preaches the value of argumentation with peers.

by Elizabeth Dwoskin

These are heady times in the world of education technology. Tools such as iPads, Kinect and Wii increasingly provide opportunities for "embodied cognition" -- the acquisition of knowledge through touch, movement, gesture, sight, hearing and other activities involving the body as well as the intellect. Interactive software generates data that can illuminate which teaching strategies work best for students in a particular classroom, school or district. Even low-tech approaches, such as debate via chat function or manipulation of objects on screen, seem to intensify focus.

It's easy to imagine a future world of super-smart people, using technology to solve all kinds of daunting challenges. But the new digital age is also prompting a rethinking of traditional measures of intelligence.

"The focus on IQ and smartness is a big mistake," says Herbert Ginsburg, TC's Jacob H. Schiff Foundation Professor of Psychology and Education. "We should be talking about skills, learning strategies and understanding concepts." 

What might a new world of enhanced education ideally look like? Would there be any downsides? And what's needed to make the dream a reality?

Teachers College has pondered such questions since its founding. A century ago, the psychologist Edward Lee Thorndike conducted the first scientific studies of how animals learn. In the 1970s, Ernst Rothkopf, a scientist at Bell Laboratories, described the "mathemagenic" learner, who uses what he or she already knows to solve new problems in ways that are personally relevant. Rothkopf, later TC's first Cleveland E. Dodge Professor of Telecommunications and Education, helped blaze the trail for subsequent generations of cognitive researchers and technology developers.  

 
TAPPING HUMAN POTENTIAL

Technology is flashy stuff, but in education, the best methods at each stage of life often build on skills and knowledge we already possess.  

"We say young children are like sponges because their brains handle tremendous amounts of new stuff in preconscious, unscripted ways," says Karen Froud, Associate Professor of Speech & Language Pathology and Director, Neurocognition of Language Lab. "A three-year-old learns to distinguish a transitive from an intransitive verb, or "Ahh" from "Uh," perhaps through some statistical sampling ability, by constantly filtering the frequency of what she hears."
But while you can put a sponge next to water, you have to immerse it to get it to drink.

"There's this idea that if you let kids play they'll learn everything they need to know," says Ginsburg. "I think that's exaggerated."

Ginsburg believes that very young children have innate capabilities and rudimentary conceptual understandings that education could do more to tap. Through hours of videotaped sessions, he has demonstrated that children as young as 18 months have a sense of "everyday math" that includes number operations, shape, pattern and cardinality.

Software called MathemAntics, developed by Ginsburg, uses child-friendly visuals to "give us an opportunity to see what's in kids' minds as they work on both informal and formal mathematical problems." For example, MathemAntics might display groups of elephants and ask which group has more -- the one with five elephants or the one with three. Complicating the challenge, the elephants in the group of three may be larger. "If a child gives the wrong answer, we know he still needs to learn that the size of the object does not matter when you are counting," Ginsburg explains.
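The logic of such an item is simple enough to sketch. The snippet below is a minimal illustration, not MathemAntics itself, of how a comparison task can decouple object size from count, so a size-based answer can be distinguished from a count-based one; the names and values are hypothetical.

```python
# Illustrative sketch of a "which group has more?" item, where object size
# is deliberately decoupled from count (hypothetical, not MathemAntics code).
from dataclasses import dataclass

@dataclass
class Group:
    label: str
    count: int        # how many elephants are shown
    object_size: int  # relative on-screen size of each elephant

def correct_answer(a: Group, b: Group) -> str:
    # The right answer depends only on count, never on how big the objects look.
    return a.label if a.count > b.count else b.label

def diagnose(child_answer: str, a: Group, b: Group) -> str:
    if child_answer == correct_answer(a, b):
        return "counts correctly despite the size distraction"
    return "may still believe that bigger objects mean 'more'"

# Three large elephants vs. five small ones: the size cue points the wrong way.
few_big = Group("left", count=3, object_size=3)
many_small = Group("right", count=5, object_size=1)

print(diagnose("left", few_big, many_small))   # size-based answer -> misconception flagged
print(diagnose("right", few_big, many_small))  # count-based answer -> correct
```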

Sandra Okita, who develops humanoid robots, sees potential for robots to help children reshape and reflect on their own understandings.

Okita, Assistant Professor of Technology and Education, believes technological artifacts such as robots, agents and avatars "consist of strong social components that enable students to share knowledge and ideas and to develop a 'peer-like' relationship that may reveal new insights into the role of social relationships in learning." Okita has found that children seem to learn more from interacting with peer-like robots than with a robot that functions as an authority figure.

In Okita's classroom of the future, a fourth-grader might work with a two-foot-high robot to solve a math problem. The robot will have a name and speak in a child's voice, and instead of doing the teaching, it will take guidance from the child. When unsuccessful, the robot may seem perplexed or confused. Through this "recursive feedback in learning by teaching," the child may discover that the problem lies not in her teaching method, but in her deeper lack of understanding of the concept.

"With a peer, students not only think in more depth when tasked with teaching, but also monitor feedback from others, thus learning to monitor themselves," Okita says. "By carefully designing this feedback, I'm trying to create an ideal peer learner for each student."

As students mature, they increasingly benefit from such "metacognition," or conscious articulation of what they know and how they know it.    

"The amount of information is exploding, to the point where we can no longer hope to teach kids more than a smattering of what's out there," says TC's Deanna Kuhn, Professor of Psychology and Education. "Instead, students must be able to recognize what information they need, how to find and make sense of that information, and how to apply it where it's needed. Argumentation with peers is a way to hone those skills."  

The new Common Core State Standards, adopted by 45 states over the past two years, heavily emphasize argumentation. Kuhn, meanwhile, engages Harlem middle-schoolers in twice-weekly electronic dialogues on social issues. After two years, they write better essays on a new topic than students in a comparison class devoted to whole-group discussion and practice in essay writing. The electronic debaters benefit from fielding direct challenges to their claims, Kuhn says, but technology also allows them to reflect on a written record of the dialogue. Verbal dialogue, in contrast, disappears as soon as it's been spoken.

Another powerful instructional approach harnesses the fact that, as Charles Kinzer, Professor of Psychology and Education, puts it, "technology is inherently motivating." Kinzer has overseen TC developmental projects such as Lit2Quit, a smoking-cessation game that mimics certain physical sensations provided by cigarettes. Now other TC faculty members are tapping the features through which online games have engaged 90 percent of all teenagers and become a $50 billion industry.

In a research project known as the "Gamification of Education," Joey J. Lee, Assistant Professor of Technology and Education, studies the incorporation of game elements and principles (missions and quests, progress bars of "experience points," in-class power-ups) into classrooms to enhance learning. "Well-designed games naturally afford behaviors and mindsets that are good for learning," Lee says. "Can the principles of games be used in classrooms to change how people learn? Can we use game mechanics to afford exploration, collaboration, risk-taking and problem-solving? Can we cultivate a mastery orientation and a winner's mindset -- persistence, gaining experiences and skills, and learning through failure? Well-designed games do this through rapid feedback, missions and larger quests as structured tasks and goals to achieve."
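The elements Lee lists map onto simple mechanics. Below is a hedged sketch, with invented names and point values rather than anything from the Gamification of Education project, of how experience points, a progress bar and missions that reward learning through failure might be tracked.

```python
# Hypothetical sketch of classroom game mechanics: experience points,
# a progress bar toward the next level, and missions that grant XP.
# Names and point values are invented for illustration.

class Student:
    def __init__(self, name, xp_per_level=100):
        self.name = name
        self.xp = 0
        self.xp_per_level = xp_per_level

    @property
    def level(self):
        return self.xp // self.xp_per_level + 1

    def progress_bar(self, width=20):
        filled = int(width * (self.xp % self.xp_per_level) / self.xp_per_level)
        return "[" + "#" * filled + "-" * (width - filled) + "]"

    def complete_mission(self, mission, succeeded):
        # Failure still earns some XP, so risk-taking is never a dead loss.
        self.xp += mission["xp"] if succeeded else mission["xp"] // 4
        return f"{self.name}  level {self.level}  {self.progress_bar()}"

fractions_quest = {"name": "Fraction Frontier", "xp": 60}

student = Student("Ana")
print(student.complete_mission(fractions_quest, succeeded=False))  # learning through failure
print(student.complete_mission(fractions_quest, succeeded=True))
```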

Adult learners, too, can benefit from virtual environments.  

"The stuff you learn in school tends to be very thin," says John Black, TC's current Cleveland E. Dodge Professor of Telecommunications and Education. "You don't understand it very well and you forget it almost immediately -- it doesn't affect the way you think about the world." That, Black says, is because formal education overemphasizes symbolic learning and downplays learning through a full sensory experience involving sight, sound, movement, the body and imagination.
Black, who chairs TC's Department of Human Development, and his students have established a hierarchy of effectiveness in embodied cognition comprising, from least to most, watch, do, feel, move. More recently, Black and Saadia Khan, TC Adjunct Assistant Professor and Post-Doctoral Research Fellow, have explored a realm they call "surrogate embodied cognition," which adds two new dimensions to that hierarchy: imagine and emote. For example, graduate students who used role play in the virtual realm Second Life to enhance their reading of the history of the Timurid dynasty in the 15th century Mughal Empire (now Northern India, Pakistan and Bangladesh) remembered the material in more detail, understood it better and also reported greater motivation to learn.  

"Embodied experiences will make what you've learned become a part of you," Black says. "Role-playing activities helped the students imagine and feel the world referred to in the text they are learning, deepening their understanding and improving their memory." (Read a profile of Saadia Khan on page 110.)


TOO MUCH OF A GOOD THING?

Some who believe technology is already fostering a more stimulating intellectual environment point to the fact that IQs are increasing -- on average by a few points with each generation. But researchers also have found that creativity has been declining among all Americans since the early 1990s and among children in particular. When that news broke in 2010, many experts immediately blamed the "creativity crisis" on the number of hours kids spend in front of computers and TVs.

John Black dismisses such charges, arguing that the problem isn't technology but rather the ways in which our education system deploys it.

"Learning today focuses on standard solutions to standard problems," says Black. "Instead we must increase ability to imagine possibilities, to see how possibilities change as conditions change and to see how one can act to bring about desired possibilities."   

Black believes that technology can engender two skills -- systems thinking and computational thinking -- that traditional education has largely ignored.

"Systems thinking involves viewing the world as a system of entities linked by functional relations," he says. "One implication of such thinking is realizing that any action we take may have indirect as well as simple effects.  For example, if we cut the government budget to reduce a deficit, it may indirectly reduce employment and purchasing, which then reduces tax revenue and so in the end increases the deficit."  

Computational thinking includes the ability to consider "general forms of problems one is solving, and what general solutions for such problems might be," Black says. That's precisely the skill employed by computer programmers who must create code to govern a range of possible actions. Computational thinking also emphasizes problem decomposition (breaking down complex problems into simpler ones). Black and his students study the teaching of computational thinking skills via a visual computer programming language called Scratch, which enables kids to drag and drop objects that represent conditions and consequences.

"The student thinking in these programs then gets embodied in an avatar surrogate that moves through a virtual world on the computer screen -- or alternatively, in a robot surrogate moving in the real world," Black says. 

Charles Kinzer argues that technology can play a vital role in embedding learning in precisely the kinds of unstructured situations that allow young children to discover and create. 

"The critics aren't looking at technological advances that have changed what kids can do," he says. "Kids can play now and not be in front of a screen.  They can play with two or three people at a time in the next generation of Kinect [Microsoft's motion-sensing device for its X-Box 360 video game console and Windows PCs]. "You can play a game, whether it's educational or entertainment, and you're not tied to a mouse. There's a wide sensory field, so you can do things together."

In education, Kinzer says, the next frontier is "wearable computing" -- clothing with embedded technology that will monitor what children do. "Instead of asking, 'What's two plus two?' you'll play a game where you combine two of something with two of something else, and you use the four -- and that shows you understand the concept."


SO HOW DO WE GET THERE?

In medicine, researchers are learning how a person's individual genetic makeup, combined with environmental factors, determines the illnesses he or she is susceptible to and the medicines to which he or she might best respond. Now the federal government is planning a vast brain-mapping effort modeled on the Human Genome Project, spurred in part by technologies such as fMRI and EEG that reveal precisely where and how learning occurs in the brain.

It may be years before we can target learning interventions to people's genetic makeup. But researchers using those same diagnostic tools, coupled with empirical evidence culled from deep analysis of student work and behavior, are revealing how people learn best at different stages of life or in different subject areas.   

Consider President Obama's recent proposal to make the federal Head Start program available to all four-year-olds. While some researchers worry the plan will make the pre-K experience overly academic and others argue that Head Start confers no lasting benefits (see story on page 32), Karen Froud hopes to settle such debates with evidence from the only source she considers truly definitive: the brain. Using EEG, which measures the brain's real-time responses to different stimuli, Froud will assess whether five-year-olds who have had two years of pre-K process certain stimuli differently than peers who have had only one year, and whether their brains are more responsive to learning environments.

"The real science of learning is about what we're doing to the brain when we're exposing people to different kinds of experiences," Froud says. "As we understand that, we'll know what strategies are effective, in what domains, given at what dosages, so to speak, for how long and at what stages of human development." 

For Froud, the real goal is not to make humans a smarter species, but to enable all people to tap the potential they already have.

"The way we go about education does leave people behind," Froud says. "We need to look at the processes of education, not the products -- how the brain is doing this, not just the results -- and that will help us close the gap."
