How can humans understand speech and localize sound in everyday listening environments?
Research in our lab focuses on the human ability to function in complex auditory environments, such as classrooms, restaurants, playgrounds, and “cocktail parties”. To understand how the brain determines the location and content of important sounds, we study hearing in adults and children with normal hearing, as well as in individuals with impaired hearing who have received cochlear implants (CIs). In the laboratory we simulate aspects of “real world” listening situations. For example, can we ignore irrelevant sounds more easily when they are spatially separated from the speech we want to hear? Some of our newer work focuses on the integration of facial (visual) cues with the sounds we hear.
Our methods include behavioral tasks, eye tracking, pupillometry, and functional near-infrared spectroscopy (fNIRS) to examine neural signatures of auditory processing and cognitive load.
Adults who are deaf and use bilateral cochlear implants
We find significant benefits from the use of two implants compared with one: improved speech understanding in the presence of competing speech, improved sound localization, and reduced listening effort. However, these benefits vary greatly across patients. One known problem is that bilateral CI users are essentially fitted with two separate monaural devices that are not synchronized. In addition, binaural benefits may be reduced because many CI users experience auditory deprivation prior to implantation, leading to degeneration of auditory neurons. We study the importance of access to binaural synchrony by using customized research processors that bypass the clinical processors. These devices are engineered to deliver binaural cues directly to pairs of electrodes in the right and left ears, with precise synchronization. This experimental approach enables us to measure the extent to which bilateral CI users are sensitive to auditory cues known to be important for listeners with normal hearing.
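One of the binaural cues discussed here is the interaural time difference (ITD): the tiny delay between a sound's arrival at the two ears. As a rough, self-contained illustration (not the lab's actual software), the sketch below estimates an ITD from a pair of simulated ear signals by finding the lag that maximizes their cross-correlation:

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference (seconds) by cross-correlation.

    Positive ITD means the right-ear signal lags the left-ear signal,
    i.e. the source is toward the listener's left (illustrative convention).
    """
    corr = np.correlate(right, left, mode="full")
    lags = np.arange(-len(left) + 1, len(right))
    return lags[np.argmax(corr)] / fs

# Illustrative stimulus: a 500 Hz tone reaching the right ear 0.5 ms late,
# on the order of the largest ITD produced by an adult human head.
fs = 48_000                       # sample rate (Hz)
t = np.arange(0, 0.05, 1 / fs)    # 50 ms of signal
delay_samples = 24                # 0.5 ms at 48 kHz
left = np.sin(2 * np.pi * 500 * t)
right = np.roll(left, delay_samples)

itd = estimate_itd(left, right, fs)
print(f"Estimated ITD: {itd * 1e6:.0f} microseconds")  # ~500 microseconds
```

Normal-hearing listeners can detect ITDs of only tens of microseconds, which is why the millisecond-scale timing mismatch between two unsynchronized clinical processors can erase this cue for bilateral CI users.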
Development of binaural hearing in typically-developing children
Children spend many hours a day in enclosed spaces and are known to be worse than adults at ignoring information from echoes and at hearing speech in noise. We study the challenges they face in simulated complex environments using a “listening game,” a software platform that allows the child to interact with the computer and receive reinforcement and feedback. We study the maturation of these abilities in typically-developing children and their relationship to non-auditory measures, such as speech production, phonological awareness, vocabulary acquisition, non-verbal IQ, and word learning.
Emergence of spatial hearing skills and non-auditory skills in children who use cochlear implants
A growing number of children worldwide have been implanted bilaterally, and more recently children with one normal-hearing ear are being considered candidates for a CI in the deaf ear. We measure how well these children can discriminate sounds from the right versus the left, as well as how they achieve more complex sound localization abilities. We have found that factors such as age at implantation, amount of bilateral experience, and chronological age at the time of testing may be significant.