Re-interpreting the processing of music

In a joint editorial, researchers from the Max Planck Institute for Empirical Aesthetics, Aix-Marseille University, New York University and the University of Geneva call for a revision of existing theories on the neuronal mechanisms of temporal processing of music and language.

MEG scanner. Photo: NIMH (see below) / Wikimedia Commons

The team presents a theoretical approach that comprehensively integrates the interaction of different brain regions into existing processing models. For the first time, this approach can account for both periodic and aperiodic temporal predictions within a single framework.

Recognizing temporal structure and predicting the timing of a signal are fundamental abilities of the human brain. These abilities are all the more essential for understanding speech or processing music, both of which go beyond mere hearing. Previous research has mainly focused on the neuronal mechanisms that allow us to process periodic, i.e. regularly recurring, signals and to make temporal predictions based on them. Until now, it was assumed that this happens through neuronal oscillations in the brain that track a repetitive signal.

Current research findings show that the human brain is also capable of making aperiodic, i.e. non-regular, temporal predictions. This cannot be adequately explained by oscillation theory alone. Aperiodic predictions allow the brain, for example, to estimate the trajectory of a movement or to follow the dynamics of a conversation. This suggests that aperiodic processes are no less crucial for navigating everyday life than periodic ones and should therefore be researched just as thoroughly, supported by a well-founded theoretical model.

Previous models assume different mechanisms for periodic and aperiodic predictions: on the one hand a stimulus-driven mechanism, in which oscillations are entrained by a signal, and on the other a mechanism guided by our memory. In contrast, current research shows that neuronal oscillations in the brain are influenced by higher-level processing stages that also include aperiodic processes, with several areas of the brain active at the same time. "It is likely that there is a unified mechanism that is based on neuronal oscillations but is not purely stimulus-driven," explains lead author Johanna Rimmele from the Max Planck Institute for Empirical Aesthetics. "Oscillations still seem to play a central role in neuronal processing, but temporal predictions can only be comprehensively explained by a more complex view involving several interacting brain areas," the neuroscientist continues.

The editorial concludes with open questions about possible research directions that build on the model. The aim is to further refine the new findings at a theoretical level and to substantiate them experimentally.

Original publication:
Rimmele, J. M., Morillon, B., Poeppel, D., & Arnal, L. H. (2018). Proactive sensing of periodic and aperiodic auditory patterns. Trends in Cognitive Sciences. https://doi.org/10.1016/j.tics.2018.08.003

 

Photo: National Institute of Mental Health, National Institutes of Health, Department of Health and Human Services 

 
