Silent Lips: Music to the Mind's Ear
24 April 1997 8:00 pm
Anyone who has struggled to converse at a noisy party knows that eye contact aids listening. But scientists have been in the dark about exactly how facial movements help the brain decipher babble. Now a team reports in today's issue of Science that the act of watching lips--even those moving silently--stimulates a brain region that processes sound and could amplify the signal from the ears. The finding could also shed light on how babies learn to talk.
The influence of sight on speech was first demonstrated more than 20 years ago. In a classic experiment, subjects watched a face silently pronounce "ga" while hearing a voice say "ba." Curiously, the subjects reported perceiving an entirely different sound: "da." Such crossed signals are what make watching a badly dubbed movie so frustrating, says neuroscientist Gemma Calvert of Oxford University in the United Kingdom, whose team set out to see how the brain itself responds to visual cues linked to speech.
The team placed volunteers inside a magnetic resonance imaging scanner, which estimates brain activity from blood flow. Each of the five volunteers listened to spoken numbers between one and 10. As expected, brain regions responsible for processing sound and language lit up. But tests in which the subjects watched a video of a face silently mouthing the numbers produced a surprise. The imager detected activity not only in the visual cortex but also in the primary auditory cortex--a basic processing station for sound--and in a nearby language region called Wernicke's area.
A second experiment revealed that the visual cues seem to amplify the signal sent from the primary auditory cortex to language centers. When the team combined the tapes of spoken numbers with the video of the mouth, the primary auditory cortex was about a third more active than when just the tapes were played. "It might be like turning the volume knob up," says Calvert. The language-processing centers were also more active, although they showed a smaller increase.
"It's a clever, straightforward finding," says Joseph Rauschecker, a neuroscientist at the Georgetown University Medical Center. The study's implications, he suggests, go far beyond cocktail party chitchat. By stimulating language centers in the brain, visual cues could be helping babies learn to imitate their mother's speech, he says.