Washington, Jan 24 (ANI): The movement of facial skin plays a key role not only in the way the sounds of words are made, but also in the way they are heard, says a new study.
"How your own face is moving makes a difference in how you 'hear' what you hear," said first author Takayuki Ito, a senior scientist at Haskins Laboratories, a Yale-affiliated research laboratory.
When Ito and his colleagues used a robotic device to stretch the facial skin of "listeners" in a way that would normally accompany speech production, they found that it affected how the subjects heard speech sounds.
The subjects listened, one word at a time, to stimuli drawn from a computer-produced continuum between the words "head" and "had."
When the robot stretched the listener's facial skin upward, words sounded more like "head." With downward stretch, words sounded more like "had." A backward stretch had no perceptual effect.
The timing of the skin stretch was also critical: perceptual changes were observed only when the stretch was similar to what occurs during speech production.
These effects of facial skin stretch indicate the involvement of the somatosensory system in the neural processing of speech sounds.
The study shows that there is a broad, non-auditory basis for "hearing" and that speech perception has important neural links to the mechanisms of speech production.
The study "Somatosensory function in speech perception" has been published in the Proceedings of the National Academy of Sciences (PNAS). (ANI)