
Ear sounds reveal eye movements

Summary: Researchers have discovered that ears make subtle sounds in response to eye movements, allowing them to determine where a person is looking.

Studies show that ear sounds, potentially caused by muscle contractions or hair cell activation, can reveal eye position.

These findings challenge existing beliefs about the function of the ear and suggest that ear sounds may help synchronize the perception of sight and sound. The team’s innovative approach could lead to new clinical hearing tests and a deeper understanding of sensory integration.

Key Facts:

  1. The research revealed that subtle ear sounds are associated with eye movements, providing insight into where a person is looking.
  2. This phenomenon is most likely caused by the brain coordinating eye movements with ear muscle contractions or hair cell activation.
  3. These findings open up the possibility of new clinical tests and a better understanding of how the brain integrates visual and auditory information.

Source: Duke University

Scientists can now determine where a person’s eyes are looking just by listening to their ears.

“You can actually predict eye movements, and the position of the target the eyes are about to look at, just from recordings made with a microphone in the ear canal,” said Jennifer Groh, Ph.D., senior author of the new report and a professor in the departments of psychology and neuroscience, and of neurobiology, at Duke University.

One project group is examining how the ear sounds that accompany eye movements differ in people with hearing or vision impairments. Credit: Neuroscience News

In 2018, Groh’s team discovered that the ears make subtle, imperceptible sounds when the eyes move. In a new report, appearing the week of November 20 in the journal Proceedings of the National Academy of Sciences, the Duke team now shows that these sounds can reveal where your eyes are looking.

It also works in reverse. Just by knowing the direction a person is looking, Groh and her team were able to predict the shape of the subtle sound waves in the ear.

Groh believes these sounds may be produced when eye movements prompt the brain to contract either the middle ear muscles, which normally help dampen loud sounds, or the hair cells, which help amplify quiet ones.

The exact purpose of these ear squeaks is unclear, but an initial hunch is that they might help sharpen people’s perception.

“We think this is part of a system that allows the brain to adapt to the location of sights and sounds, even though our eyes can move while our head and ears do not,” Groh said.

Understanding the relationship between subtle ear sounds and vision could lead to the development of new clinical tests of hearing.

“If each part of the ear contributes its own rules for the eardrum signal, it could be used as a kind of clinical tool to evaluate which part of the ear’s anatomy is malfunctioning,” said Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology and neuroscience at Duke University.

Just as the pupil of the eye contracts or dilates like the aperture of a camera to regulate the amount of light entering, the ear also has its own way of regulating hearing. Scientists have long believed that this sound-control mechanism only helps to amplify quiet sounds or dampen loud ones.

But in 2018, Groh and his team discovered that the same sound regulation mechanisms are also activated by eye movements, indicating that the brain informs the ear about eye movements.

In their latest study, the research team followed up on their initial findings and investigated whether faint auditory signals contain detailed information about eye movements.

To decode the ear sounds, Groh’s team at Duke University and Professor Christopher Shera, Ph.D., of the University of Southern California, recruited 16 adults with good vision and hearing to Groh’s laboratory in Durham for a fairly simple set of eye tests.

Participants viewed a stationary green dot on a computer screen and then, without moving their head, traced the dot with their eyes as it disappeared and reappeared up, down, left, right, or diagonally from the starting point. This gave Groh’s team a variety of auditory signals produced when the eyes move horizontally, vertically, or diagonally.

An eye tracker recorded the direction of each participant’s pupil movement for comparison with the ear sounds, which were captured using a pair of microphone-equipped earbuds.

The research team analyzed the ear sounds and found unique signatures for different directions of movement. This allowed them to decode the ear sounds and calculate where people were looking just by examining the sound waves.
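Conceptually, this decoding step amounts to learning a mapping from ear-canal waveforms to gaze displacement. Below is a minimal sketch of that idea in Python; the placeholder data, window sizes, and choice of ridge regression are illustrative assumptions, not the authors’ actual analysis pipeline.

```python
# A minimal sketch of the decoding idea: regress gaze displacement on
# ear-canal waveforms. All data here are random placeholders; trial counts,
# window lengths, and the ridge-regression model are assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_samples = 500, 480                   # e.g., 10 ms windows at 48 kHz
X = rng.standard_normal((n_trials, n_samples))   # stand-ins for real ear recordings
y = rng.uniform(-18, 18, size=(n_trials, 2))     # (horizontal, vertical) gaze shift, degrees

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a linear map from waveform samples to the 2D gaze displacement.
model = Ridge(alpha=1.0).fit(X_train, y_train)
pred = model.predict(X_test)

# With real recordings, low error here would indicate the waveforms carry
# directional information; with this placeholder noise, error stays near chance.
print("Mean absolute error (degrees):", np.abs(pred - y_test).mean())
```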

“Because diagonal eye movements are just a horizontal component and a vertical component, my labmate and co-author David Murphy realized that you could take those two components and guess what they would look like when you put them together,” Lovich said.

“Then you can go in the opposite direction and look at the sound oscillations to estimate where someone is looking, such as 30 degrees to the left.”
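Lovich’s point about composing and decomposing movements follows from basic trigonometry: given decoded horizontal and vertical components, the overall amplitude and direction of the gaze shift fall out directly. A short sketch, with made-up component values:

```python
import numpy as np

# Hypothetical decoded components of one eye movement, in degrees of visual
# angle (negative horizontal = leftward). Values are invented for illustration.
horizontal, vertical = -10.4, 6.0

# Combining the two components gives the full gaze vector.
amplitude = np.hypot(horizontal, vertical)                # overall movement size
direction = np.degrees(np.arctan2(vertical, horizontal))  # angle from rightward axis

# Prints: amplitude = 12.0 deg, direction = 150.0 deg
# (i.e., 30 degrees above straight left)
print(f"amplitude = {amplitude:.1f} deg, direction = {direction:.1f} deg")
```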

Groh’s lab is now starting to examine whether these ear sounds play a role in perception.

One project group is examining how the ear sounds that accompany eye movements differ in people with hearing or vision impairments.

Groh is also testing whether people without hearing or vision impairments produce ear signals that can predict how well they perform on sound-localization tasks, such as locating an ambulance while driving, which rely on mapping auditory information onto a visual scene.

“Some people have signals that are really reproducible from day to day, and you can measure them quickly,” Groh said. “You might expect those people to be especially good at visual-auditory tasks compared to other people, whose signals are more variable.”

Funding: Groh’s research was supported by a grant from the National Institutes of Health (NIDCD DC017532).

About this visual and auditory neuroscience research news

Author: Dan Vahaba
Source: Duke University
Contact: Dan Vahaba – Duke University
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Parametric information about eye movements is sent to the ear” by Jennifer Groh et al. PNAS


Abstract

Parametric information about eye movements is sent to the ear

As the eyes move, the alignment between the visual and auditory scenes changes. We do not perceive this shift, which suggests that the brain must incorporate precise information about eye movements into auditory and visual processing.

Here we show that small sounds produced by the brain in the ear contain precise information about contemporaneous eye movements in the spatial domain: the direction and amplitude of eye movements can be inferred from these small sounds.

The underlying mechanisms likely involve the ear’s various motor structures and may facilitate the translation of incoming auditory signals into a reference frame anchored to the eyes and the visual scene.

