Multisensory learning methods using gestures and pictures make it easier to remember words

Pictures facilitate learning: our brain remembers the words. © MPI f. Human Cognitive and Brain Sciences / v. Kriegstein



"Atesi" -- what sounds like a word from the Elven language of Lord of the Rings is actually a Vimmish word meaning "thought." Scientists from the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig have used Vimmish, an artificial language specifically developed for scientific research, to study how people can best memorise foreign-language terms. According to the researchers, it is easier to learn vocabulary if the brain can link a given word with different sensory perceptions. The motor system in the brain appears to be especially important: When someone not only hears vocabulary in a foreign language, but expresses it using gestures, they will be more likely to remember it. Also helpful, although to a slightly lesser extent, is learning with images that correspond to the word. Learning methods that involve several senses, and in particular those that use gestures, are therefore superior to those based only on listening or reading.

For most students, the very thought of learning new vocabulary evokes a groan. Rote learning of long lists of words must surely be one of the most unpopular types of schoolwork. That said, many schools and language courses have now recognised that learning outcomes improve if vocabulary is presented not only as a word but also, for example, as an image. The theory of multisensory learning states that the brain learns more easily when several senses are stimulated in parallel.


The results obtained by the Leipzig-based researchers confirm this. For their study, the scientists used Vimmish, an artificial language they developed themselves that follows phonetic rules similar to those of Italian. This ensured that the vocabulary was equally new to all participants. Over the course of a week, young women and men memorised the meanings of abstract and concrete Vimmish nouns under different conditions. In the first experiment, the subjects heard a word and then observed a corresponding image or gesture. In the second experiment, they themselves traced the corresponding word symbolically in the air or expressed it with a gesture. The researchers then checked, at different times after the learning period, whether the participants could still recall the terms.


"The subjects' recollection was best in relation to terms they themselves had expressed using gestures. When they heard the term and its translation and also observed a corresponding image, they were also better able to remember the translation. By contrast, however, tracing a term or observing a gesture was no better than just hearing the term," explains Katja Mayer of the Max Planck Institute for Human Cognitive and Brain Sciences. The way a term was learned was even reflected in the subjects' brain activity. In this way, areas of the brain responsible for the motor system were active when a subject translated a term previously learned through gesture, while areas of the visual system were active in the case of words learned with the help of images.


This suggests that the brain learns foreign words more easily when they are associated with information from different sensory organs. It may be that these associations are mutually reinforcing, imprinting the foreign-language term and its translation more deeply in the mind. "If, for example, we accompany a new term with a gesture, we create additional input that makes it easier for the brain to learn," says Katharina von Kriegstein, head of the study at the Max Planck Institute for Human Cognitive and Brain Sciences. The scientists now want to discover whether the activity in the motor and visual centres is actually the cause of the improved learning outcomes. They plan to do this by activating neurons in these regions with electrodes and measuring the effect on learning.


It is not only in learning vocabulary that the multisensory principle applies; other studies have shown that multisensory input also facilitates word recognition in a person's native language. "If we're on the phone with someone we know, for example, the brain areas responsible for face recognition are active during the call. It seems that the brain simulates the information the eyes are not capturing and creates it for itself," explains von Kriegstein.


Thus, we learn with all our senses. Taste and smell also have a role in learning, and feelings play an important part too. But does multisensory learning work according to the principle: the more senses, the better? "That could well be so," says von Kriegstein, "but we don't know how much the learning outcomes improve with the addition of more senses. Ideally, however, the individual sensory impressions should match one another. In other words, to learn the Spanish word for apple, the subject should make an apple gesture, taste an apple or look at a picture of an apple."

