You may have experienced the phenomenon at a baseball game, at a fireworks display, or when estimating your distance from a storm by counting the seconds between a lightning flash and the thunder that follows: the delay between what we see and what we hear becomes more obvious over long distances because sound travels much more slowly than light.
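As a rough illustration, the counting trick amounts to multiplying the delay by the speed of sound. This sketch is our own, not part of the study; the 343 m/s figure assumes air at about 20 °C, and the function name is illustrative.

```python
# Estimating distance from the delay between seeing and hearing an event.
# Sound travels at roughly 343 m/s in air at 20 °C; over these distances,
# light arrives effectively instantaneously.

SPEED_OF_SOUND_M_PER_S = 343.0  # approximate value, at 20 °C

def distance_from_delay(delay_seconds: float) -> float:
    """Approximate distance (meters) to an event, given the delay
    between seeing its flash and hearing its sound."""
    return SPEED_OF_SOUND_M_PER_S * delay_seconds

# A 3-second gap between lightning and thunder puts the strike ~1 km away:
print(round(distance_from_delay(3.0)))       # → 1029 (meters)

# The ~40 ms delays in the study correspond to only about 14 m:
print(round(distance_from_delay(0.040), 1))  # → 13.7 (meters)
```

This is why a 40 ms delay is plausible as a distance cue at everyday scales, even though it is far too brief to notice consciously.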
But new research from the University of Rochester reveals our brains are also able to process sound delays that are too short to be noticed consciously and, by doing so, fine-tune what our eyes see.
Duje Tadin, associate professor of brain and cognitive sciences at the University of Rochester, is the senior author of the study. "Much of the world around us is audiovisual," he says. "Although humans are primarily visual creatures, our research shows that estimating relative distance is more precise when visual cues are supported by corresponding auditory signals. Our brains recognize those signals even when they are separated from visual cues by a time that is too brief to consciously notice."
Tadin says his team has discovered how humans can unconsciously detect sound delays as short as 40 milliseconds (ms). "Our brains are very good at recognizing patterns that can help us," said Phil Jaekl, Tadin's colleague. "Now we also know that humans can unconsciously recognize the link between sound delays and visual distance, and then combine that information in a useful way."
The researchers used projections of three-dimensional images to test how sound delays are used by the brain to estimate the relative distance of objects. In the first experiment, participants were asked to adjust the relative depth of two identical shapes until they appeared to be at the same distance when viewed through special 3-D glasses. As each shape appeared it was accompanied by an audible click. The researchers adjusted the timing of the click so it came either just before or just after the shape appeared.
Those taking part in the study consistently perceived a shape that was followed by a very slightly delayed click as being more distant. "This surprised us," Jaekl says. "When the 3-D shapes were the same distance, participants were consistently biased by the sound delay to judge the shape paired with the delayed click as being further away — even though it wasn't."
In a follow-up experiment, participants were shown three-dimensional shapes that were quickly moved either toward or away from them. When the shape appeared with a sound delayed by 42 milliseconds, the team found that participants were more likely to perceive it as more distant, even in cases when the object was actually shifted toward the participant. Most importantly, when an object that was shifted away was paired with the sound delay — a relationship consistent with the natural world — participants were able to judge relative distance with greater precision.
"It's striking that this bias is unconscious — participants were unable to consciously detect when sound delays were present, yet the delays had a strong influence over their perception of distance," says Jaekl. The results are published in the online journal PLOS ONE. It seems you can tell in a moment not only that your friends are nearby, but exactly how far away they are, too.