The relevance of our work

We live in a world in which we are continually bombarded with information from our different sensory systems, and the brain's ability to combine that information, known as multisensory integration, is a ubiquitous phenomenon. The utility of multisensory interactions is illustrated by numerous studies from our lab and others showing the important role these processes play in altering our behaviors and shaping our perceptions. In addition, our lab (along with others) is beginning to highlight the role that altered multisensory function plays in clinical conditions such as autism and schizophrenia. The following video highlights some of this work in children with autism.

Impact through multidisciplinary research

Ultimately, we are interested in providing a more complete understanding of how multisensory processes impact our behaviors and perceptions, in better elucidating the neural substrates for these interactions, and in understanding how multisensory processes develop and are influenced by sensory experience. We study these fundamental questions using a multidisciplinary set of approaches, including animal behavior, human psychophysics, neuroimaging (ERP and fMRI), and neurophysiological techniques. Along with characterizing the brain bases for multisensory processes under normal circumstances, we also examine how multisensory circuits are altered in an array of clinical conditions, including attention deficit hyperactivity disorder, autism spectrum disorder, and developmental dyslexia.

Current research projects



Neurocomputational Modeling

David Tovar

The goal of my research is to uncover the neural computations that transform incoming sensory information as it moves up the neural hierarchy to the point when and where it can be used to guide behavior. My research spans different levels of analysis, from sensory processing within brain circuits, to interactions between circuits in different brain areas, to relating brain activity to computational models such as convolutional neural networks. I analyze monkey neural spikes, local field potentials, and current source density, as well as human EEG, MEG, and fMRI.
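As a rough illustration of one piece of this kind of model-to-brain comparison, the sketch below extracts layer-by-layer activations from a convolutional network for a set of stimuli, which could then be compared against recorded neural responses. The network (an untrained AlexNet), the random stimulus tensor, and the layer selection are placeholder assumptions, not the lab's actual analysis pipeline.

```python
# Minimal sketch (not the lab's pipeline): extract per-layer CNN activations
# for a set of stimuli so they can later be compared with neural recordings.
# The model, stimuli, and layer choice are placeholders.
import torch
import torchvision.models as models

model = models.alexnet(weights=None).eval()   # untrained stand-in; any CNN works
stimuli = torch.rand(20, 3, 224, 224)         # 20 placeholder "images"

activations = {}                              # layer name -> (n_stimuli, n_units)

def make_hook(name):
    def hook(module, inputs, output):
        activations[name] = output.flatten(start_dim=1).detach()
    return hook

# Register a hook on each convolutional layer to capture its responses
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Conv2d):
        module.register_forward_hook(make_hook(name))

with torch.no_grad():
    model(stimuli)

for name, act in activations.items():
    print(name, act.shape)                    # stimulus-by-unit activation matrices
```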


Cross-Network Multisensory Motion Processing

Adriana Schoenhaut

To effectively make decisions in a dynamic, noisy environment, you need to make optimal use of all the information available to you. Doing so requires integrating bottom-up sensory information, which is in turn modulated by top-down attentional processes. My goal is to understand how dynamic top-down and bottom-up unisensory and multisensory information interacts and converges across different levels of processing, using neurophysiological recordings and various computational modeling approaches.

For my main project, I plan to train non-human primates to perform a motion discrimination paradigm with auditory, visual, and combined audiovisual motion stimuli. I will simultaneously record from neurons in areas MT/MST, PPC, and dlPFC, each of which is sensitive to different stimulus features (low-level vs. higher-order features). During the task, I will manipulate both low-level (e.g., stimulus motion coherence) and higher-level (e.g., attention, task) factors to expose the differential effects these changes have on representations of motion in each of these areas. These representational (as well as behavioral) dynamics will be revealed using computational modeling methods and representational similarity analysis (RSA). Using RSA, differences in the degree of correlation between models and neural data in each region will elucidate where and when different features play a critical role in sensory processing and multisensory integration.
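For readers unfamiliar with RSA, the sketch below illustrates the core computation under placeholder assumptions: build a representational dissimilarity matrix (RDM) from a region's condition-by-condition response patterns and from a candidate model, then quantify their agreement with a rank correlation. The arrays here are random stand-ins, not data or models from this project.

```python
# Minimal RSA sketch with placeholder data (not actual recordings or models).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_conditions, n_neurons, n_model_units = 12, 80, 50

neural_responses = rng.normal(size=(n_conditions, n_neurons))    # e.g., firing rates per condition
model_features = rng.normal(size=(n_conditions, n_model_units))  # e.g., a model's feature vectors

# RDMs: 1 - correlation between condition patterns, in condensed (upper-triangle) form
neural_rdm = pdist(neural_responses, metric="correlation")
model_rdm = pdist(model_features, metric="correlation")

# Rank correlation between RDMs: how similar are the two representational geometries?
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"model-brain RDM correlation: rho={rho:.3f}, p={p:.3f}")
```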


Modeling Spatial Integration Changes in Autism

Sarah Vassall

Much of our social world is inherently multimodal, and optimal speech comprehension occurs when both auditory and visual signals are present. Importantly, autistic children are known to have poorer multisensory integration abilities, a difference that is correlated with social communication symptoms. However, audiovisual spatial integration - combining sights and sounds with different spatial locations (e.g., tone of voice and facial expression) - has not been characterized in autism.
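One standard way such audiovisual spatial integration is often formalized, offered here purely as an illustrative sketch rather than this project's specific model, is maximum-likelihood cue combination, in which each modality's location estimate is weighted by its reliability (inverse variance):

```python
# Illustrative maximum-likelihood cue-combination sketch
# (placeholder numbers, not this project's model or data).
import numpy as np

def integrate(loc_a, sigma_a, loc_v, sigma_v):
    """Inverse-variance-weighted estimate of a single audiovisual location."""
    w_a = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
    w_v = 1 - w_a
    loc_av = w_a * loc_a + w_v * loc_v                         # combined location estimate
    sigma_av = np.sqrt(1 / (1 / sigma_a**2 + 1 / sigma_v**2))  # combined uncertainty
    return loc_av, sigma_av

# Example: a noisy sound estimated at 10 deg and a reliable visual cue at 4 deg
loc_av, sigma_av = integrate(loc_a=10.0, sigma_a=8.0, loc_v=4.0, sigma_v=2.0)
print(f"integrated location: {loc_av:.1f} deg, sd: {sigma_av:.1f} deg")
```

Under this account the integrated estimate is more precise than either cue alone, which offers one benchmark against which group differences in integration can be compared.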

The goal of my research is to elucidate the relationship between audiovisual spatial integration abilities and clinical assessment scales in order to better understand how autistic children use space in their conceptualization of social interactions. Further, I aim to understand how electrophysiological differences contribute to differences in spatial representation, and whether those differences predict spatial impairments.


Neural Plasticity in Sensory Deprivation and Restoration

Ansley Kunnath

I aim to explore the factors driving cortical reorganization and cross-modal plasticity following hearing loss and cochlear implantation, in order to better predict individualized cochlear implant performance before surgery. I also plan to use this information to develop targeted approaches to improve hearing outcomes in cochlear implant users, using functional neuroimaging (i.e., fNIRS) and multisensory training paradigms.