THURSDAY, Jan. 30, 2014 (HealthDay News) — New research into human understanding of language suggests that the brain comprehends speech by picking up on certain kinds of sounds — so-called acoustic signatures.
“This is a very intriguing glimpse into speech processing,” study senior author Dr. Edward Chang, a neurosurgeon and neuroscientist at the University of California, San Francisco, said in a university news release. “The brain regions where speech is processed had been identified, but no one has really known how that processing happens.”
Researchers wanted to track the brain’s response to spoken sentences that include all the speech sounds in the English language. To do so, they monitored the neural activity of six patients who were undergoing surgery for epilepsy while the patients listened to 500 English sentences spoken by 400 different people.
The researchers found that parts of the brain are tuned in to general acoustic signatures instead of specific speech sounds like those made by the letters B or Z.
“By studying all of the speech sounds in English, we found that the brain has a systematic organization for basic sound feature units, kind of like elements in the periodic table,” Chang said.
The study appears in the Jan. 30 issue of the journal Science Express.
More information
For more on speech and communication disorders, visit the U.S. National Library of Medicine.