Listen to a few minutes of John Coltrane and Stan Getz trading saxophone licks, and there’s no denying that music is a form of conversation: The two jazz legends riff on each other’s melodies and build to a cat-and-mouse climax that is basically the musical equivalent of Shakespearean repartee.
But how the brain processes musical discourse is not well understood. Is music a language? If not, how can we still use it to communicate?
At Johns Hopkins University, a team of researchers recently came up with a way to study the neurological basis of musical exchange by watching the brain activity of jazz improvisers. In one way, they found, music is exactly like language, but in another, it’s not similar at all.
The musicians, 11 male pianists, had to put up with a little less comfort than they’re accustomed to on the stage of a hip club. Each slid supine into an MRI machine with a custom-built, all-plastic keyboard on his lap and a pair of mirrors arranged overhead so he could see the keys. For 10 minutes, he was asked to jam with another musician in the room by trading fours—swapping solos every four bars—as the MRI machine recorded the sparks flying in their heads.
The results took a big step toward describing the complexity of music’s relationship to language. During the improvisations, the syntactic areas of players’ brains—that is, the areas that interpret the structure of sentences—were super active, as if the two players were speaking to each other. Meanwhile, the semantic areas of their brains—the parts that process language’s meaning—totally shut down. The brain regions that respond to musical and spoken conversation overlapped, in other words, but were not entirely the same.
This distinction has two big implications. First, because our brains don’t discriminate between music and language structurally, we may in fact understand the structures of all forms of communication in the same way. Second, the way we understand the actual meaning of a conversation—the message its lines deliver to us—appears to depend on the medium of communication.
“Meaning in music is fundamentally context-specific and imprecise, thereby differing wholly from meaning in natural language,” writes Charles Limb, the study’s lead author (who’s also an accomplished musician; check out his TED talk on the science of improvisation). “This study underscores the need for a broader definition of musical semantics,” he concludes later.
Music is not a language, but it certainly doesn’t lack meaning, the study shows. We rely on something beyond language to fully comprehend it.