Presidential Scholars Research Symposium
How do our senses deal with uncertain and incomplete information? Why does music sometimes influence our emotions? Dig deeper into these two research questions on February 16 with the Presidential Scholars Research Symposium: 2nd Year Presentations. Presidential Scholars in Society and Neuroscience Raphael Gerraty and Matthew Sachs will be presenting on their cross-disciplinary research, with discussion from their faculty mentors in psychology, neuroscience, philosophy, and music. Before the event, we spoke with each scholar about their work.
Raphael Gerraty is studying the role of probability in how the brain recognizes visual stimuli and represents uncertainty. Drawing on artificial intelligence and philosophy, Gerraty’s research has focused on creating deep neural networks that use probabilistic inference to solve object recognition tasks, with plans to test these theories on human volunteers in the future. During spring 2020, Gerraty co-taught the undergraduate course “Philosophy of Psychology” with mentor John Morrison, an Associate Professor of Philosophy at Barnard College.
The basic idea is that our brains’ representations of our environment are in some sense based on probability: that our brains have evolved to approximate the rules of probability in order to express uncertainty about things in our world.
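To make the probabilistic idea concrete, here is a minimal sketch of Bayes’ rule applied to a toy object-recognition problem. The categories, priors, and likelihood values are invented for illustration and are not taken from Gerraty’s work:

```python
# Toy Bayesian inference over two hypothetical object categories.
# All numbers here are made up purely for illustration.

def posterior(priors, likelihoods):
    """Combine prior beliefs with evidence via Bayes' rule and normalize."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Prior belief: "cat" and "dog" equally likely before seeing the image.
priors = {"cat": 0.5, "dog": 0.5}
# How well the noisy visual input fits each hypothesis.
likelihoods = {"cat": 0.8, "dog": 0.2}

beliefs = posterior(priors, likelihoods)
print(beliefs)  # {'cat': 0.8, 'dog': 0.2}
```

The output is itself a probability distribution, which is one way a system can "express uncertainty" rather than committing to a single answer.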
Getting neural networks to represent their own uncertainty has been harder than I anticipated.
Artificial intelligence (AI) plays an essential role. There is currently an incredible amount of cross-talk between AI and neuroscience, to the extent that some of our most predictive theoretical models of the brain’s visual system were engineered in AI. As a core part of this project, we develop probabilistic deep neural networks to test as models of visual processing in the human brain.
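One simple way a network can report its own uncertainty is through the entropy of its output distribution. The sketch below uses hypothetical logits, not outputs of any real model from this project; it only illustrates the general idea of reading uncertainty off a softmax:

```python
import math

# A network's softmax output is a probability distribution over classes;
# its Shannon entropy is one simple measure of the model's uncertainty.
# The logits below are hypothetical values for illustration only.

def softmax(logits):
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    """Shannon entropy in bits; higher means more uncertain."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

confident = softmax([5.0, 0.0, 0.0])   # strongly favors class 0
uncertain = softmax([1.0, 1.0, 1.0])   # no class preferred

print(entropy(confident) < entropy(uncertain))  # True
```

Probabilistic deep networks go further than this post-hoc reading, building distributions into the model itself, but the entropy of a softmax is the most common baseline notion of network uncertainty.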
Not being in the lab for a year has had a really big impact. I haven't been able to run any experiments [with human subjects], which has shifted the focus so far to working on the philosophy and theory sides of the project.
Hopefully we are timing the development of experiments right for things to start opening back up. So the next steps would be to collect some behavioral data and then soon after that to start brain imaging.
Matthew Sachs is investigating the complex feelings people have when listening to music, such as feelings of chills, sadness, and nostalgia, using functional magnetic resonance imaging (fMRI) to examine the brain during these emotional and physical experiences. This past November, Sachs organized and moderated How Music Moves Us: Exploring the Connection Between Music and Emotions, an event bringing together perspectives in neuroscience, artificial intelligence, computational science, and music composition.
Most of my research uses fMRI to infer the human neural activity associated with music listening. fMRI involves recording changes in levels of deoxygenated hemoglobin in the brain over time, which we interpret as a measure of brain functioning. Many regions of the brain show changes in activation in response to musical stimuli, so the difficult question my work tries to answer is which aspects of our experience of listening to music these brain responses correspond to. In previous papers I have shown that parts of the auditory cortex and insula show patterns of activity that represent the perceived emotion being conveyed in sounds, whether musical or vocal. I have also shown that co-activation of a network of midline cortical brain regions (often called the default mode network) changes in response to changes in our enjoyment of sad music; that is, when people felt that a piece of sad music was more enjoyable, activation in their default mode networks appeared to be temporally aligned.
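The temporal-alignment idea can be sketched as a correlation between two time courses. The short lists below stand in for the long fMRI signal time series used in real analyses, and the values are invented for illustration:

```python
# Sketch of "temporal alignment": Pearson correlation between two
# hypothetical default-mode-network activation time courses.
# Real analyses use full fMRI time series; these lists are illustrative.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

listener_a = [0.1, 0.5, 0.9, 0.4, 0.2, 0.7]
listener_b = [0.2, 0.6, 0.8, 0.5, 0.1, 0.6]  # similar rises and falls

print(round(pearson(listener_a, listener_b), 2))  # 0.93
```

A correlation near 1 means the two time courses rise and fall together, which is the sense in which activation can be said to be temporally aligned.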
Live music, and the social context that typically surrounds it, are great fodder for our emotions. We know that people are more likely to experience intense emotional responses, and in particular chills, when watching live music. Seeing the intensity of a performer, combined with the collective experience of being in an audience, can heighten the emotions that the music itself is able to induce. That being said, we can, and do, experience intense emotions in response to recordings. Listening with a close friend or someone we care about can heighten our emotional responses, and listening to recordings allows us the time and space to hear nuances in the music that might be missed at a live performance.
Music and emotions are both complex phenomena, meaning that it is always difficult to determine which aspects of the experience map onto which patterns of brain activation we observe. Isolating individual aspects of the music listening experience and incorporating imaging techniques that have finer temporal resolution would help corroborate or refute some of my hypotheses. Furthermore, listening to music during fMRI is a very unnatural experience, so more research is needed to determine how the lab setting influences these responses and whether similar findings are observed outside of the lab.
Join Raphael Gerraty, Matthew Sachs, and their faculty mentors for the Presidential Scholars Research Symposium: 2nd Year Presentations on February 16 at 4:30 PM ET on Zoom. This event is free and open to the public; RSVP is required via Eventbrite.
This event is hosted by Presidential Scholars in Society and Neuroscience.