Washington, July 13 (ANI): An observer can detect an image displayed too quickly to be consciously seen, but only if he or she first hears the name of the object, according to cognitive psychologists at the University of Pennsylvania and the University of California.
Through a series of experiments, the researchers found that hearing the name of an object improved participants' ability to see it, even when the object was flashed onscreen at speeds (50 milliseconds) that would otherwise render it invisible.
Surprisingly, the effect seemed to be specific to language.
A visual preview did not make the invisible target visible.
Getting a good look at the object before the experiment did nothing to help participants see it when it was flashed.
The study demonstrated that language can change what we see and can also enhance perceptual sensitivity.
The finding that verbal cues can influence even the most elementary visual processing informs our understanding of how language affects perception.
Led by psychologist Gary Lupyan of the Department of Psychology at Penn, the researchers had participants complete an object-detection task in which they made an object-presence or -absence decision about briefly presented capital letters.
Other experiments within the study further defined the relationship between auditory cues and identification of visual images.
The researchers found that the effect of the auditory cue was not diminished no matter where on the screen the target appeared, an advantage over visual cues.
Researchers also found that the magnitude of the cuing effect correlated with each participant's own estimation of the vividness of their mental imagery.
Using a common questionnaire, the researchers found that those who considered their mental imagery particularly vivid scored higher when provided with an auditory cue.
The team went on to determine that the auditory cue improved detection only when the cue was correct; that is, the target image and the verbal cue had to match.
According to the researchers, hearing the object's label evokes a mental image of it, strengthening its visual representation and thus making it visible.
"This research speaks to the idea that perception is shaped moment-by-moment by language. Although only English speakers were tested, the results suggest that because words in different languages pick out different things in the environment, learning different languages can shape perception in subtle, but pervasive ways," said Lupyan.
The study is published in the journal PLoS ONE. (ANI)