To find a causal relationship between neural activity and behavior, neuroscientists typically record animals' behavior and brain activity in a controlled environment and then manually annotate both the behavioral and the neural activity data. This is an inefficient, time-consuming process that is subjective and prone to human error: the results depend on who records the observations and are therefore not reproducible.
In recent years, there has been a growing trend toward automated processing of this data to improve efficiency and reproducibility. This is precisely the approach that the researcher Waseem Abbas has proposed in his thesis as part of the UOC’s doctoral program in Network and Information Technologies. Part of the research has already been published in three scientific journals: Journal of Neuroscience Methods, Sensors and IEEE Access.
The thesis proposes deep learning-based solutions for processing the neural activity data and behavioral data observed in head-fixed mice. The goal is to enable neuroscientists to annotate the behavioral data and extract neural patterns automatically, and to establish a causal link between the two. "We have proposed a deep learning-based pipeline for gesture tracking that explicitly encodes the temporal information present in the videos," Abbas explained.
The researcher also analyzed neural images of the rodents acquired with genetically encoded calcium indicators (GECIs). "When a neuron is active, the calcium level inside the cell changes and, through the indicator, this change can be seen under a fluorescence microscope," he continued.
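The article does not go into the underlying computation, but the effect Abbas describes is commonly quantified as ΔF/F, the relative change of a cell's fluorescence with respect to its baseline. The short Python sketch below illustrates only that general idea; the function name, the percentile-based baseline estimate and the activity threshold are assumptions, not the method used in the thesis.

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=20):
    """Relative fluorescence change (dF/F) for one cell's time series.

    trace: 1-D array of raw fluorescence values over time.
    The baseline F0 is estimated as a low percentile of the trace,
    a common heuristic; the thesis may estimate it differently.
    """
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0

# Toy usage: a mostly flat trace with one transient rise in fluorescence.
trace = np.array([10.0, 10.2, 9.9, 10.1, 18.0, 16.5, 12.0, 10.3])
dff = delta_f_over_f(trace)
active_frames = np.where(dff > 0.5)[0]  # illustrative threshold for "active"
print(dff.round(2), active_frames)
```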
The scientist trained deep learning algorithms he had developed to automatically track the mice's limb movements in the videos and to detect all the active neurons in the neural activity images. Specifically, he designed them to take the spatiotemporal context into account.
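The article does not describe the network architectures, but one common way to make a model account for the spatiotemporal context is to convolve jointly over space and time. The PyTorch sketch below is only an illustration of that idea, with assumed layer sizes, an assumed keypoint count and an assumed class name; it is not the model developed in the thesis.

```python
import torch
import torch.nn as nn

class SpatioTemporalKeypointNet(nn.Module):
    """Illustrative sketch: a tiny 3-D CNN that looks at a short clip of
    grayscale video frames and regresses 2-D limb keypoint coordinates.
    All layer sizes and names are assumptions for illustration only."""

    def __init__(self, num_keypoints=4):
        super().__init__()
        self.num_keypoints = num_keypoints
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),  # joint space-time filters
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),         # downsample in space only
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                     # collapse time and space
        )
        self.head = nn.Linear(32, num_keypoints * 2)     # (x, y) per keypoint

    def forward(self, clip):
        # clip: (batch, 1, frames, height, width)
        feats = self.features(clip).flatten(1)           # (batch, 32)
        return self.head(feats).view(-1, self.num_keypoints, 2)

# Toy usage: a batch of two 8-frame, 64x64-pixel clips of random noise.
model = SpatioTemporalKeypointNet(num_keypoints=4)
clip = torch.randn(2, 1, 8, 64, 64)
print(model(clip).shape)  # torch.Size([2, 4, 2])
```

An analogous approach, convolving over the image stack rather than a video of the animal, could in principle be used to decide which regions of the calcium imaging data correspond to active neurons.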
Interdisciplinary research
The thesis is an example of interdisciplinary collaboration, said David Masip, the thesis' supervisor. "We collaborate with researchers in the field of neuroscience to help relate the neural connections, which are visible in vivo using calcium-based imaging, to joint movements," explained Masip, director of the UOC Doctoral School, professor at the Faculty of Computer Science, Multimedia and Telecommunications, and principal investigator of the Scene Understanding and Artificial Intelligence (SUNAI) research group.
According to the researcher, the methodology that has been developed makes it possible to record and process large volumes of data: videos of moving mice on the one hand and brain data cubes on the other, a major automation exercise that has been successfully carried out with the new algorithms.