WEDNESDAY, Nov. 25 (HealthDay News) — People can “hear” not only with their ears, but also with their skin, new research shows.
In fact, sensations on the skin designed to mimic certain types of speech actually helped people decipher sounds better, the Canadian scientists found.
“We have never been able to show whether we could use tactile information in this way,” said Bryan Gick, co-author of a letter published in the Nov. 26 issue of Nature.
At this point, the research has more implications for basic science, for “how perception works,” explained Gick, an associate professor of linguistics at the University of British Columbia in Vancouver. “We’re picking up on this information, and integrating it seamlessly [in the brain].”
But, he added, “once we understand the mechanics, it’s much easier to see how applications could grow out of it. Perhaps we could design a perceptual aid [for people with hearing impairments] or special headphones for pilots to distinguish sounds and noises.”
Scientists already knew that visual cues — looking at a person’s face or lips, for instance — can help someone figure out what that person is saying, but little research has looked into the tactile side of things.
Traditional thought held that one hears with the ears and sees with the eyes, with each of these perceptions linked to a separate part of the brain.
More recent research, however, has suggested that the senses merge when interpreting sights or sounds. “The brain doesn’t care where the information comes from,” Gick said. “It picks up from different senses.”
If sight and sound don’t match, for example, what you’re seeing can actually override what you’re hearing.
“People would report having heard what the eyes tell them,” Gick said.
The researchers designed their study around the fact that language includes both aspirated sounds such as “pa” and “ta,” which are spoken with a small burst of breath, and unaspirated sounds such as “ba” and “da,” which are not.
Small puffs of air were delivered through vinyl tubing to the skin of the hands and necks of 66 volunteers. When the unaspirated sounds “ba” and “da” were paired with a puff of air (mimicking an aspirated sound), the participants thought the sounds were actually “pa” and “ta.”
“The nature of tactile stimulation can influence the actual part of speech you can perceive,” said Robert Frisina Jr., associate chair of otolaryngology at the University of Rochester Medical Center, in Rochester, N.Y. “People with hearing impairments could have significant improvement when they’re provided with tactile cues,” he noted.
“The findings are pretty novel and provocative. You wouldn’t expect that kind of [difference] from a little puff of air,” Frisina added. “The areas of the brain for touch and for hearing are connected. Neurologically, it does make sense.”
“Individuals are really picking up on certain clues that we may not necessarily be aware of,” said Dr. Thomas Brammeier, director of the Hearing and Balance Center at Scott & White in Temple, Texas.
More information
The University of California, Santa Cruz has more on how hearing works.