How can humans understand speech and localize sound in everyday listening environments?
The research in our lab focuses on the ability of humans to function in complex auditory environments, such as classrooms, restaurants, playgrounds and “cocktail parties”. To understand how the brain determines the location and the content of important sounds, we study hearing in adults and children with normal hearing, as well as in individuals with impaired hearing who have received cochlear implants (CIs). In the laboratory we simulate aspects of “real world” listening situations. For example, can we ignore irrelevant sounds more easily if they are spatially separated from the speech we want to hear? Some of our newer work focuses on the integration of facial (visual) cues with the sounds we hear.
Our methods include behavioral tasks, eye tracking, pupillometry and functional near infrared spectroscopy to examine neural signatures for auditory processing and cognitive load.
Adults who are deaf and use bilateral cochlear implants
We find significant benefits from the use of two implants compared with one implant: improved speech understanding in the presence of competing speech, improved localization of sounds, and reduced listening effort. However, the benefits vary greatly across patients. A known problem is that bilateral CI users are essentially fitted with two separate monaural devices that are not synchronized. In addition, binaural benefits may be reduced because many CI users experience auditory deprivation prior to being implanted, leading to degeneration of auditory neurons. We study the importance of access to binaural synchrony by using customized research processors that bypass the clinical processors. These devices are engineered to deliver binaural cues directly to pairs of electrodes in the right and left ears, with precise synchronization. This experimental approach enables us to measure the extent to which bilateral CI users are sensitive to auditory cues that we know to be important for listeners with normal hearing.
Development of binaural hearing in typically-developing children
Children spend many hours a day in enclosed spaces and are known to be worse than adults at ignoring information from echoes and hearing speech in noise. We study what challenges they face in simulated complex environments using a “listening game.” The software platform allows the child to interact with the computer and receive reinforcement and feedback. We study the maturation of these abilities in typically-developing children and their relationship to non-auditory measures, such as speech production, phonological awareness, vocabulary acquisition, non-verbal IQ, and word learning.
Emergence of spatial hearing skills and non-auditory skills in children who use cochlear implants
A growing number of children worldwide have been implanted bilaterally, and more recently children with one normal-hearing ear are being considered candidates for a CI in the deaf ear. We measure how well the children can discriminate sounds presented from the right versus the left, and also how well they achieve more complex sound localization abilities. We have found that factors such as age at implantation, amount of bilateral experience, and chronological age at the time of testing may be significant.
Auditory function, cognition, language and brain structure in adults with Down syndrome
Down syndrome (DS) is a leading known cause of intellectual disability and a highly recognized genetic syndrome that involves multiple medical co-morbidities. Hearing deficits in DS are estimated to occur at a rate of 80-90% and are thought to be caused by a combination of structural and functional abnormalities in the external and/or inner ear. This project aims to tackle a timely and significant question regarding the role of hearing loss in DS on auditory function, cognition, language, and the structural integrity of brain regions that are important for hearing, meeting the programmatic objectives of the INvestigation of Co-occurring conditions across the Lifespan to Understand Down syndromE (INCLUDE) initiative. Results will provide information regarding clinical diagnosis and intervention to facilitate treatment of hearing disorders in individuals with DS; a longer-term impact will be the identification of measures that may be used to evaluate the effectiveness of pharmaceutical and therapeutic trials.
Auditory function, cognition, and brain structure in children with Down syndrome
We are recruiting children ages 10-17 with Down syndrome (DS) to study associations between hearing status, auditory function, cognition, language and brain imaging through functional and structural approaches. There is a paucity of auditory studies in this population, despite the known incidence of hearing loss in infants and children with DS being much higher than in the general population. This preliminary project will establish feasibility for larger projects on auditory function and its association with cognition, language, and brain structure and function in children with DS.