How does the brain process the abundance of information it continuously receives from the sensory organs? What strategies does it use to combine that information into a coherent percept of our environment, and how is relevant information processed amidst irrelevant sensations?
The mechanisms of multi-sensory processing are of great interest to numerous disciplines, ranging from information theory and channel coding to artificial intelligence and machine learning. Taking a page from David Marr's book on vision, I am interested in the computational theory behind the logic by which multi-sensory integration is carried out. Given the stochastic nature of human subjects' responses to identically presented stimuli, and the electrophysiological properties of individual neurons, I find the process best studied within a probabilistic framework.
We are exploring the general theoretical rules governing the integration of information from different sensory modalities by comparing data from psychophysical experiments on human subjects to statistical inference models. Recent work has shown that human multi-sensory integration and segregation strategies closely resemble those of a Bayesian ideal observer. Currently, I am interested in Bayesian unsupervised learning strategies that the brain might use for cross-sensory recalibration. By investigating correlations between model parameters and electroencephalography (EEG) readings, we aim to identify neural correlates of where such computations might take place.
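The core computation behind the Bayesian ideal observer model of cue integration can be illustrated with a minimal sketch. Assuming two independent Gaussian cues (say, an auditory and a visual estimate of a source's location), the optimal fused estimate weights each cue by its reliability, the inverse of its variance; the function name and parameter values below are illustrative, not taken from any specific study:

```python
import numpy as np

def fuse_cues(mu_a, sigma_a, mu_v, sigma_v):
    """Maximum-likelihood fusion of two independent Gaussian cues.

    Each cue's estimate (mu) is weighted by its reliability
    (1 / variance); the fused estimate has lower variance than
    either cue alone.
    """
    w_a = 1.0 / sigma_a**2   # reliability of the auditory cue
    w_v = 1.0 / sigma_v**2   # reliability of the visual cue
    mu = (w_a * mu_a + w_v * mu_v) / (w_a + w_v)
    sigma = np.sqrt(1.0 / (w_a + w_v))
    return mu, sigma

# Hypothetical example: a noisy auditory estimate at 10 degrees
# (sigma = 4) and a precise visual estimate at 0 degrees (sigma = 1).
mu, sigma = fuse_cues(10.0, 4.0, 0.0, 1.0)
# The fused estimate is pulled strongly toward the more reliable
# visual cue, consistent with visual "capture" effects such as
# the ventriloquist illusion.
```

This reliability-weighted averaging is the prediction against which psychophysical data are compared: when a cue is artificially degraded, human estimates shift toward the remaining reliable cue in close agreement with the model.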
My career path at UCLA has been an interesting and challenging journey. The NeuroEngineering Training Program has given me a strong foundation in the principles of neuroscience, and the Neuroimaging Training Program (NITP) has provided training in neuroimaging principles along with hands-on experience in functional magnetic resonance imaging (fMRI) and EEG methodologies. We have much to learn from the computational power of the brain. Uncovering the basic computational strategies of the human nervous system opens the door to better rehabilitation strategies using neural prosthetics, unsupervised learning algorithms that mimic human behavior, advanced autonomous machines that can adaptively learn in and interact with hazardous environments, and the use of neural activity as a control signal for brain-machine interfaces (BMIs).