TUESDAY, July 27 (HealthDay News) — During a conversation, the brain activity of both listener and speaker may look remarkably similar, especially when the two are really understanding each other, a new study finds.
Researchers asked 11 participants to listen to a recording of a woman recounting an amusing, stream-of-consciousness story about being asked to the senior prom when she was a high school freshman.
Brain scans taken with functional MRI showed that activity in the listeners’ brains closely resembled the brain activity of the woman telling the story, a process the researchers call “neural coupling.”
“There is much more commonality between the process of producing speech and comprehending speech than one might have thought,” said study author Greg Stephens, a postdoctoral fellow at Princeton University. “The more coupling there is, the more the speaker and the listener are using similar mechanisms.”
Brain scans further showed that in some areas of the brain, “coupling” occurs at the same time the speaker is talking, while in other areas the coupling lags behind, Stephens said. Sometimes activity in the listener’s brain even comes before the corresponding activity in the speaker’s brain, suggesting the listener may be anticipating what the speaker is going to say.
Such mirroring may aid comprehension, Stephens said. After listening to the story, participants were given a questionnaire measuring how well and how deeply they had understood it.
Brain scans of those who scored highest on the comprehension measure and seemed to have the most nuanced understanding of the story showed the most complete “neural coupling” with the speaker, possibly hinting at why some people click during conversation and others don’t, Stephens said.
“There was a strong correlation between how much of the listener’s brain matched the speaker’s brain and how well the listener understood the story,” Stephens said.
When participants were asked to listen to someone speaking in Russian, a language none of the participants knew, brain scans showed no such “neural coupling.”
“If your brain is really similar to mine, I might use my own brain to predict what your brain is doing,” Stephens said. “That might be really beneficial for our understanding of each other.”
The study is published in this week’s online edition of the Proceedings of the National Academy of Sciences.
The finding builds on a prior study by the same team showing that people’s brain activity looks alike while they watch the same video clips. This new study uses an innovative technique to see what happens during speech production and comprehension, said David Poeppel, a professor of psychology and neural science at New York University.
“The fact that the parts of the brain underlying production and comprehension of language are the same is not surprising,” Poeppel said. “This study is a really nice verification of the hypothesis that the organization for language in the brain is very robust and uniform across individuals.”
Testing what’s going on during an actual conversation would be difficult. Not only is it unpredictable what people will say, but neural processing of speech takes mere milliseconds, while functional MRI is a much slower, cruder way of measuring changes in blood flow in the brain, Poeppel said.
Paul Sanberg, a professor of neurosurgery and director of the Center of Excellence for Aging and Brain Repair at the University of South Florida, called the research “interesting.”
“This study is potentially very important in understanding how our brain works when we communicate with others, which eventually could lead to future therapies for communication disorders and patients with brain damage,” Sanberg said.
More information
The U.S. National Science Foundation has more on how the brain processes language.