Scientists Reconstruct a Pink Floyd Song by Analyzing Brain Activity

16 August 2023

In what seems like something out of a sci-fi movie, scientists have plucked the famous Pink Floyd song “Another Brick in the Wall” from individuals’ brains.

Using electrodes, computer models and brain scans, researchers previously have been able to decode and reconstruct individual words and entire thoughts from people’s brain activity.

The new study, published August 15 in PLOS Biology, adds music into the mix, showing that songs can also be decoded from brain activity and revealing how different brain areas pick up an array of acoustical elements. The finding could eventually help improve devices that allow communication for people with paralysis or other conditions that limit the ability to speak.

People listened to Pink Floyd’s “Another Brick in the Wall” while their brain activity was monitored. Using that data and a computer model, researchers were able to reconstruct sounds that resemble the song.

To decode the song, neuroscientist Ludovic Bellier of the University of California, Berkeley and colleagues analyzed the brain activity recorded by electrodes implanted in the brains of 29 individuals with epilepsy. While in the hospital undergoing monitoring for the disorder, the individuals listened to the 1979 rock song.

People’s nerve cells, particularly those in auditory areas, responded to hearing the song, and the electrodes detected not only neural signals associated with words but also rhythm, harmony and other musical aspects, the team found. With that information, the researchers trained a computer model to reconstruct audio from the brain activity data and found that its output resembled the song.
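For readers curious about what such a decoding model can look like in practice, below is a minimal, hypothetical sketch in Python. It assumes the brain activity has already been summarized as a time-by-electrode feature matrix and the song as a time-by-frequency spectrogram, and it uses a simple ridge regression (one reasonable choice, not necessarily the study’s exact method) to map one to the other. All variable names, data shapes and values are stand-ins.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 5,000 time points, 100 electrodes, 128 spectrogram bins.
neural = rng.standard_normal((5000, 100))       # e.g., per-electrode activity over time
spectrogram = rng.standard_normal((5000, 128))  # target: the song's auditory spectrogram

# Hold out the last 20% of the recording for testing (no shuffling: this is a time series).
X_train, X_test, y_train, y_test = train_test_split(
    neural, spectrogram, test_size=0.2, shuffle=False
)

# One simple decoding model: ridge regression from electrode activity to every spectrogram bin.
model = Ridge(alpha=1.0)
model.fit(X_train, y_train)
reconstruction = model.predict(X_test)

# Score the reconstruction as the mean correlation between predicted and actual bins.
corrs = [np.corrcoef(reconstruction[:, i], y_test[:, i])[0, 1]
         for i in range(y_test.shape[1])]
print(f"mean reconstruction correlation: {np.mean(corrs):.3f}")
```

In this sketch, “accuracy” is just the correlation between predicted and actual spectrogram bins; a real pipeline would also convert the predicted spectrogram back into an audible waveform.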

“It’s a real tour de force,” says Robert Zatorre, a neuroscientist at McGill University in Montreal who was not involved in the study. “Because you’re recording the activity of neurons directly from the brain, you get very direct information about exactly what the patterns of activity are.”

The study highlights which parts of the brain respond to different elements of music. For example, activity in one area within the superior temporal gyrus, or STG, located in the lower middle of each side of the brain, intensified at the onset of specific sounds, such as when a guitar note played. Another area within the STG increased its activity when vocals were present and kept it elevated for as long as they continued.

The STG on the right side of the brain, but not the left, seemed to be crucial in decoding music. When the researchers removed that brain area’s information from the computer model, the accuracy of the song reconstruction dropped.
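The “removal” described here is essentially an ablation test. Continuing the same hypothetical setup as above, the sketch below drops the columns assumed to come from right-STG electrodes, retrains the decoder, and compares reconstruction accuracy with and without them. Which electrodes belong to which region is made up purely for illustration.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Same hypothetical data shapes as in the earlier sketch.
neural = rng.standard_normal((5000, 100))       # (time x electrodes)
spectrogram = rng.standard_normal((5000, 128))  # (time x spectrogram bins)
split = 4000
X_tr, X_te = neural[:split], neural[split:]
y_tr, y_te = spectrogram[:split], spectrogram[split:]

def reconstruction_score(X_tr, y_tr, X_te, y_te):
    """Fit a ridge decoder and return the mean predicted-vs-actual bin correlation."""
    pred = Ridge(alpha=1.0).fit(X_tr, y_tr).predict(X_te)
    return np.mean([np.corrcoef(pred[:, i], y_te[:, i])[0, 1]
                    for i in range(y_te.shape[1])])

# Pretend, purely for illustration, that the first 20 electrodes sit over the right STG.
right_stg = np.arange(20)
keep = np.setdiff1d(np.arange(neural.shape[1]), right_stg)

full = reconstruction_score(X_tr, y_tr, X_te, y_te)
ablated = reconstruction_score(X_tr[:, keep], y_tr, X_te[:, keep], y_te)
print(f"full model: {full:.3f}   without right-STG electrodes: {ablated:.3f}")
```

A large drop in the ablated score, relative to the full model, is the kind of evidence that points to a region being important for the reconstruction.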

“Music is a core part of human experience,” says Bellier, who has been playing instruments since he was 6 years old. “Understanding how the brain processes music can really tell us about human nature. You can go to a country and not understand the language, but be able to enjoy the music.”

Continuing to probe musical perception is likely to be difficult because the brain areas that process it are hard to access without invasive methods. And Zatorre wonders about the broader application of the computer model, trained on just one song. “Does [it] work on other kinds of sounds, like a dog barking or phone ringing?” he asks.

The goal, Bellier says, is to eventually be able to decode and generate natural sounds in addition to music. In the shorter term, incorporating the more musical elements of speech, including pitch and timbre, into brain-computer devices could help individuals with brain lesions, paralysis or other conditions communicate better.


