Silent Lips: Music to the Mind's Ear

24 April 1997 8:00 pm

Anyone who has struggled to converse at a noisy party knows that eye contact aids listening. But scientists have been in the dark about exactly how facial movements help the brain decipher babble. Now a team reports in today's issue of Science* that the act of watching lips--even those moving silently--stimulates a brain region that processes sound and could amplify the signal from the ears. The finding could also shed light on how babies learn to talk.

The influence of sight on speech was first demonstrated more than 20 years ago. In a classic experiment, subjects watched a face silently pronounce "ga" while hearing a voice say "ba." Curiously, the subjects reported perceiving an entirely different sound: "da." Such crossed signals are what make watching a badly dubbed movie so frustrating, says neuroscientist Gemma Calvert of Oxford University in the United Kingdom, whose team set out to see how the brain itself responds to visual cues linked to speech.

The team placed volunteers inside a magnetic resonance imaging scanner, which estimates brain activity from blood flow. Each of the five volunteers listened to spoken numbers between one and 10. As expected, brain regions responsible for processing sound and language lit up. But tests in which the subjects watched a video of a face silently mouthing the numbers produced a surprise. The imager detected activity not only in the visual cortex but also in the primary auditory cortex--a basic processing station for sound--and in a nearby language region called Wernicke's area.

A second experiment revealed that the visual cues seem to amplify the signal sent from the primary auditory cortex to language centers. When the team combined the tapes of spoken numbers with the video of the mouthed numbers, the primary auditory cortex was about a third more active than when the tapes were played alone. "It might be like turning the volume knob up," says Calvert. The language-processing centers were also more active, although their increase was smaller.

"It's a clever, straightforward finding," says Joseph Rauschecker, a neuroscientist at the Georgetown University Medical Center. The study's implications, he suggests, go far beyond cocktail party chitchat. By stimulating language centers in the brain, visual cues could be helping babies learn to imitate their mother's speech, he says.

* For more details, Science Online subscribers can link to the full text of the Report.
