Untreated hearing loss can have devastating and alienating repercussions on a person’s life: isolation, depression, sapped cognition, even dementia.
Yet only one in five Americans who could benefit from a hearing aid actually wears one. Some don’t seek help because their loss has been so gradual that they do not feel impaired. Others cannot afford the device. Many own hearing aids but leave them in a drawer. Wearing them is just too unpleasant.
“In a crowded place, it can be very difficult to follow a conversation even if you don’t have hearing deficits,” says UC Berkeley neuroscientist Frederic Theunissen. “That situation can be terrible for a person wearing a hearing aid, which amplifies everything.”
Imagine the chaotic din in which everything is equally amplified: your friend’s voice, the loud people a few tables over, and the baby crying across the room.
In that scenario, the friend’s voice is the signal, or sound that the listener is trying to hear. Tuning in to signal sounds, even with background noise, is something that healthy human brains and ears do remarkably well. The question for Theunissen — a professor who focuses on auditory perception — was how to make a hearing aid that processes sound the way the brain does.
“We were inspired by the biology of hearing,” Theunissen says. “How does the brain do it?”
Songbirds excel at listening in crowded, noisy environments
Humans aren’t the only ones able to home in on specific sounds in noisy environments. For the past two years, Theunissen and the graduate students in his lab have studied songbirds, which are especially adept at listening in crowded, noisy environments.
By imaging songbird brains, the researchers now understand how these chatty, social animals distinguish the chirp of a mate from the din of dozens of other birds.
They were able to identify the exact neurons that tune in to a signal and remain tuned there no matter how noisy the environment becomes. These neurons shine what Theunissen calls an “auditory spotlight” by focusing on certain features or “edges” of a sound. Imagine you are looking for your cellphone on a table covered with objects. In the same way that your eye scans for a specific rectangular shape and color, your ear searches for and finds certain pitches and frequencies: the sound of a friend’s voice in a restaurant.
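The spotlight idea — emphasizing the spectral features of one target sound while suppressing everything else — can be loosely illustrated in code. The sketch below is not the lab's actual method, just a minimal analogy: a known frequency band (standing in for the "pitch" of a friend's voice) is isolated from a noisy mixture with a simple FFT mask. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def spectral_spotlight(mixture, fs, lo_hz, hi_hz):
    """Keep only the frequency band where the target sound lives,
    zeroing everything else -- a crude stand-in for the selective
    tuning the songbird neurons perform."""
    spectrum = np.fft.rfft(mixture)
    freqs = np.fft.rfftfreq(len(mixture), d=1.0 / fs)
    mask = (freqs >= lo_hz) & (freqs <= hi_hz)
    return np.fft.irfft(spectrum * mask, n=len(mixture))

# A 440 Hz "voice" buried in louder broadband noise.
fs = 16_000                      # sample rate, Hz
t = np.arange(fs) / fs           # one second of audio
voice = np.sin(2 * np.pi * 440 * t)
rng = np.random.default_rng(0)
mixture = voice + 2.0 * rng.standard_normal(len(t))

recovered = spectral_spotlight(mixture, fs, lo_hz=400, hi_hz=480)

# The band-limited output tracks the voice far better
# than the raw mixture does.
corr_raw = np.corrcoef(mixture, voice)[0, 1]
corr_rec = np.corrcoef(recovered, voice)[0, 1]
```

A conventional hearing aid behaves like the raw `mixture` — everything amplified together — while the spotlight keeps only the band the listener is attending to. Real voices, of course, occupy shifting, overlapping bands, which is why the problem is hard.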
“Our brain does all this work, suppressing echoes and background noise, conducting auditory scene analysis,” Theunissen says.
A Proof of Concept Commercialization Gap grant from UC Research Initiatives in the Office of the President provided the critical funding the lab needed to take the discovery one giant step further.