Researchers at the University of East Anglia have made an important discovery about the way our brains process the sensations of sound and touch.
A new study published today shows how the brain’s different sensory systems are all closely interconnected—with regions that respond to touch also involved when we listen to specific sounds associated with touching objects.
They found that these areas of the brain can tell the difference between sounds such as a ball bouncing and the sound of typing on a keyboard.
It is hoped that understanding this key area of brain function may in future help people who are neurodivergent, or who have conditions such as schizophrenia or anxiety. It could also lead to developments in brain-inspired computing and AI.
Lead researcher Dr. Fraser Smith, from UEA’s School of Psychology, says that “we know that when we hear a familiar sound such as a bouncing ball, this leads us to expect to see a particular object. But what we have found is that it also leads the brain to represent what it might feel like to touch and interact with that object.”
“These expectations can help the brain process sensory information more efficiently.”
The research team used an MRI scanner to collect brain imaging data while 10 participants listened to sounds generated by interacting with objects—such as bouncing a ball, knocking on a door, crushing paper, or typing on a keyboard.
Using a technique called functional MRI (fMRI), they measured activity across the whole brain.
They used machine learning analysis techniques to test whether the activity generated in the earliest touch areas of the brain (primary somatosensory cortex) could tell apart sounds generated by different types of object interaction (bouncing a ball versus typing on a keyboard).
They also performed the same analysis on control sounds, similar to those used in hearing tests, to rule out the possibility that this brain region can discriminate just any sounds.
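The study does not spell out its analysis pipeline here, but a minimal sketch of this kind of decoding (multivoxel pattern) analysis might look like the following. The variable names, data shapes, and the choice of a linear classifier are illustrative assumptions rather than details taken from the paper:

```python
# Hypothetical sketch of a decoding analysis on fMRI activity patterns.
# All names, shapes, and the classifier choice are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Assume X holds one activity pattern per trial, restricted to voxels in a
# somatosensory region of interest (n_trials x n_voxels), and y labels the
# sound category heard on each trial (0 = bouncing ball, 1 = typing).
rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200
X = rng.normal(size=(n_trials, n_voxels))   # placeholder data
y = np.repeat([0, 1], n_trials // 2)        # placeholder labels

# Train a linear classifier and estimate its accuracy with cross-validation.
# Accuracy reliably above chance (50% for two classes) would indicate that the
# region's activity patterns carry information distinguishing the sound types.
clf = LinearSVC(max_iter=10000)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

The same procedure run on control sounds serves as the comparison: if only the object-interaction sounds, and not arbitrary tones, can be decoded from this region, the result is specific rather than a property of any sound.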
Researcher Dr. Kerri Bailey says that their “research shows that parts of our brains, which were thought to only respond when we touch objects, are also involved when we listen to specific sounds associated with touching objects.”
“This supports the idea that a key role of these brain areas is to predict what we might experience next, from whatever sensory stream is currently available.”
Dr. Smith added that their “findings challenge how neuroscientists traditionally understand the workings of sensory brain areas and demonstrate that the brain’s different sensory systems are actually all very interconnected.”
“Our assumption is that the sounds provide predictions to help our future interaction with objects, in line with a key theory of brain function—called Predictive Processing.”
“Understanding this key mechanism of brain function may provide compelling insights into mental health conditions such as schizophrenia, autism or anxiety and in addition, lead to developments in brain-inspired computing and AI.”
This study was led by UEA, in collaboration with researchers at Aix-Marseille University (France) and Maastricht University (Netherlands).