Study explores how brain waves reflect melody predictions while listening to music

A, Snippet of the sets of regressors obtained from one piece of music used to calculate the TRFs. From top to bottom: acoustic envelope (Env), half-wave rectified first derivative of the envelope (Env’); melodic information for note onset (Mo): onset entropy (Ho), onset surprisal (So); melodic information for note pitch (Mp): pitch entropy (Hp), and pitch surprisal (Sp). Based on Di Liberto, Pelofi, Bianco, et al. (2020). B, Snippet of pre-processed EEG bands of a single subject used for TRF analysis. Gray lines represent individual channels and black lines represent the average amplitude across electrodes for each frequency band. C, Schematic of the pipeline. Melodic information features were extracted with IDyOM (Di Liberto et al., 2021; Pearce, 2005). Separate multivariate temporal response function (mTRF) models were trained from the preprocessed EEG signal and the acoustic and melodic information features (see Materials and methods). The sets of features were convolved with the corresponding optimal TRFs to predict the EEG signals. The reconstruction accuracy is calculated as the Pearson correlation between the original filtered EEG and the predicted one. D, Hypothetical enhancement in EEG reconstruction accuracy of a model trained on acoustic and melodic information variables (AM) compared to a model trained only on acoustic regressors. Based on Di Liberto, Pelofi, Bianco, et al. (2020). E, Hypothetical unique contribution of an individual feature to the full model, measured as the difference in the Pearson correlation between the reconstruction accuracy obtained using the AM model and the same model missing one specific feature. Credit: European Journal of Neuroscience (2024). DOI: 10.1111/ejn.16581
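To make the pipeline in panel C concrete: each set of stimulus features is convolved with its fitted TRF weights to produce a predicted EEG signal, and reconstruction accuracy is the Pearson correlation between the predicted and recorded signals. The sketch below illustrates just these two steps in Python/NumPy under simplifying assumptions (a single EEG channel, non-negative lags, illustrative array shapes); it is not the authors' actual mTRF code.

```python
import numpy as np

def predict_eeg(features, trf, lags):
    """Predict one EEG channel by convolving stimulus features with TRF weights.

    features : (n_samples, n_features) regressors, e.g., Env, Env', Ho, So, Hp, Sp
    trf      : (n_lags, n_features) fitted TRF weights
    lags     : non-negative sample offsets matching the TRF time axis
    """
    n_samples = features.shape[0]
    predicted = np.zeros(n_samples)
    for i, lag in enumerate(lags):
        shifted = np.zeros_like(features)
        shifted[lag:] = features[:n_samples - lag]  # features delayed by `lag` samples
        predicted += shifted @ trf[i]               # weighted sum across features
    return predicted

def reconstruction_accuracy(eeg, predicted):
    """Reconstruction accuracy: Pearson correlation of recorded vs. predicted EEG."""
    return np.corrcoef(eeg, predicted)[0, 1]

# Toy usage with hypothetical shapes.
rng = np.random.default_rng(0)
features = rng.standard_normal((1000, 6))  # six regressors, as in panel A
trf = rng.standard_normal((32, 6))         # hypothetical 32-lag TRF
eeg = rng.standard_normal(1000)            # stand-in for a filtered EEG channel
print(reconstruction_accuracy(eeg, predict_eeg(features, trf, lags=range(32))))
```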

The human brain is highly skilled at detecting patterns in the world and using this information to predict future events. This ability also shapes how we experience music, specifically our capacity to intuitively anticipate what will come next in a melody we are listening to.

Researchers at the Max Planck Institute for Human Cognitive and Brain Sciences have been trying to better understand the complex neural dynamics underpinning this ability to predict upcoming melodies in music. Their most recent paper, published in the European Journal of Neuroscience, shows that uncertainty (i.e., entropy) plays a key role in melody prediction, while also highlighting new differences in how the brains of musicians and non-musicians process music.

“Prior work on music cognition suggests that low-frequency (1–8 Hz) brain activity encodes melodic predictions beyond the stimulus acoustics,” wrote Juan-Daniel Galeano-Otálvaro, Jordi Martorell and their colleagues in their paper. “Building on this work, we aimed to disentangle the frequency-specific neural dynamics linked to melodic prediction uncertainty (modeled as entropy) and prediction error (modeled as surprisal) for temporal (note onset) and content (note pitch) information.”

For their study, the researchers recruited 20 participants, half of whom were professional pianists. The participants were asked to listen to 10 piano melodies drawn from the works of Johann Sebastian Bach, each lasting approximately 150 seconds.

The participants listened to each of these melodies three times. While they were listening, the researchers collected recordings using electroencephalography (EEG), a technique that relies on electrodes to measure the brain’s electrical activity.

The team first calculated the probability of specific notes following others in a sequence, which allowed them to compute the surprisal that each segment of a melody would typically elicit. They then separately computed entropy, a measure of uncertainty, as the expected surprisal over all possible continuations at each point in a note sequence.
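In information-theoretic terms, the surprisal of a note with probability p is -log2(p), and entropy is the expected surprisal over all candidate continuations. Here is a minimal sketch, assuming a table of next-note probabilities is already available (the study derived these with the IDyOM model):

```python
import math

def surprisal(p):
    """Surprisal, in bits, of an event that occurs with probability p."""
    return -math.log2(p)

def entropy(next_note_probs):
    """Entropy: the expected surprisal over all candidate continuations.

    next_note_probs maps each possible next note to its probability;
    the values are assumed to sum to 1.
    """
    return sum(p * surprisal(p) for p in next_note_probs.values() if p > 0)

# Toy context: after some note sequence, three continuations are plausible.
probs = {"C5": 0.7, "E5": 0.2, "G5": 0.1}
print(surprisal(probs["E5"]))  # ~2.32 bits: how unexpected E5 is once it sounds
print(entropy(probs))          # ~1.16 bits: uncertainty before the next note sounds
```

Note the distinction the study builds on: surprisal is evaluated after a note arrives, whereas entropy quantifies the uncertainty before it does.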

Galeano-Otálvaro and his colleagues analyzed the participants’ EEG recordings using standard scientific software, including MATLAB and the FieldTrip and EEGLAB toolboxes. Their analyses unveiled neural correlates of melody prediction during music listening.
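The published analysis was carried out in MATLAB with FieldTrip and EEGLAB; as an illustration of the band-splitting step shown in panel B, the Python/SciPy sketch below separates a signal into delta, theta, alpha, and beta bands using the band edges quoted in the article (the 4–8 Hz theta range follows the standard convention). The filter design is an assumption for illustration, not the study's exact preprocessing.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Band edges in Hz, as quoted in the article.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def split_into_bands(eeg, fs):
    """Band-pass filter one EEG channel into canonical frequency bands.

    eeg : 1-D array of samples; fs : sampling rate in Hz.
    Uses a 4th-order Butterworth filter applied forward and backward
    (zero phase); an illustrative choice, not the authors' pipeline.
    """
    bands = {}
    for name, (lo, hi) in BANDS.items():
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        bands[name] = sosfiltfilt(sos, eeg)
    return bands

# Toy usage with a synthetic signal containing theta (6 Hz) and beta (20 Hz) content.
fs = 250
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
bands = split_into_bands(signal, fs)
```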

“Our results show that melodic expectation metrics improve the EEG reconstruction accuracy in all frequency bands below the gamma range (< 30 Hz),” wrote the researchers. “Crucially, we found that entropy contributed more strongly to the reconstruction accuracy enhancement compared to surprisal in all frequency bands. Additionally, we found that the encoding of temporal, but not content, information metrics was not limited to low frequencies, rather it extended to higher frequencies (> 8 Hz).”

Overall, the researchers found that uncertainty (i.e., entropy) in a melody is encoded in neural responses across multiple frequency bands. They were also able to pinpoint the frequencies that appear to encode uncertainty about note onsets before a given note is heard.

“An analysis of the temporal response function (TRF) weights revealed that the temporal predictability of a note (entropy of note onset) may be encoded in the delta- (1–4 Hz) and beta-band (12–30 Hz) brain activity prior to the stimulus, suggesting that these frequency bands associate with temporal predictions,” wrote the researchers.

“Strikingly, we also revealed that melodic expectations selectively enhanced EEG reconstruction accuracy in the beta band for musicians, and in the alpha band (8–12 Hz) for non-musicians, suggesting that musical expertise influences the neural dynamics underlying predictive processing in music cognition.”

In addition to gathering insights about the brain activity underpinning melody prediction during music listening, the researchers observed differences between the electrical activity in the brains of professional pianists and non-musicians. Specifically, melodic expectations enhanced EEG reconstruction accuracy in the beta frequency band for pianists and in the alpha band for non-musicians.

These results could pave the way for new studies exploring the neural correlates of melody prediction or the differences observed between musicians and non-musicians. Such work could further enrich the present understanding of how the human brain processes music and anticipates upcoming events.

More information:
Juan-Daniel Galeano-Otálvaro et al., Neural encoding of melodic expectations in music across EEG frequency bands, European Journal of Neuroscience (2024). DOI: 10.1111/ejn.16581

© 2024 Science X Network

Citation:
Study explores how brain waves reflect melody predictions while listening to music (2024, November 14)
retrieved 14 November 2024
from https://medicalxpress.com/news/2024-11-explores-brain-melody-music.html
