The human brain is a bafflingly complicated organ that simultaneously controls a myriad of bodily functions. One of the most important social roles it helps us fulfill is the production and understanding of language.
It's been known for a long time that the majority of language processing — whether it's spoken, written or signed — is done in the left hemisphere of the brain. Research published today in the journal Current Biology, however, explains a rare exception to this rule: whistled Turkish.
Whistled Turkish, unsurprisingly, is a form of Turkish traditionally used in the mountainous regions of Turkey. To communicate across the wide, steep-sided valleys, villagers in places like Kuşköy adapted their spoken language centuries ago into a series of whistles that imitate the pitch and tonality of normal speech. Before the advent of telephones, such a system was invaluable for long-distance conversation. While whistled-Turkish speakers use spoken Turkish at close range, they switch to the whistled form at distances of around 50 to 90 meters.
Although low-frequency sounds can travel further, high frequencies are much more directional, and the human ear is more sensitive to them. As a result, whistled Turkish can carry much further than a shout or other low-frequency communication, especially across an open space unhindered by buildings or trees. "If you look at the topography, it is clear how handy whistled communication is," says Onur Güntürkün, professor of behavioral neuroscience at Ruhr-University Bochum in Germany. "You can't articulate as loud as you can whistle, so whistled language can be heard kilometers away across steep canyons and high mountains."
Whistled Turkish isn't a distinct language from Turkish, Güntürkün explains. It is Turkish converted into a different form — in the same way that the text of this article is English converted into a written form. Güntürkün, who is Turkish, says that he still found the language surprisingly difficult to understand: "As a native Turkish-speaking person, I was struck that I did not understand a single word when these guys started whistling," he says. "Not one word! After about a week, I started recognizing a few words, but only if I knew the context."
While whistled Turkish is fascinating in its own right, Güntürkün and his colleagues also realized that it would allow them to closely examine the idea that language is a predominantly left-brained activity. That's because, while the understanding of language happens in the left half of the brain — in a section called Wernicke's area — the processing of things like frequency, pitch, and melody — the music of which whistles are made — is generally considered a job for the right.
The researchers examined brain asymmetry in the processing of spoken versus whistled Turkish by presenting whistled-Turkish speakers with speech sounds delivered to their left or right ears through headphones. The participants then reported what they'd heard. While individuals more often perceived spoken syllables when they were presented to the right ear, they heard whistled sounds equally well on both sides.
"We could show that whistled Turkish creates a balanced contribution of the hemispheres," Güntürkün says. "The left hemisphere is involved since whistled Turkish is a language, but the right hemisphere is equally involved since for this strange language all auditory specializations of this hemisphere are needed." Indeed, it has been suggested that the processing of music is not as straightforward as once thought. Far from the simple left-brain-analytical, right-brain-artistic model, the way your brain deals with music might have a lot to do with whether you are a professional performer or a casual listener.
That's an idea that seems to be backed up by the new research. The researchers say their findings are important because they show that left-hemispheric dominance in language does depend on the physical form (that is, spoken, written or whistled) that the language takes. The next step, they say, is to form a clearer picture of the underlying brain processes at play when whistled Turkish is used, by looking more closely at the brain with EEG studies.
A 2014 study at Johns Hopkins Medicine was just the latest to suggest that music and spoken language use similar systems. Dr. Charles Limb was involved with the study, which looked at musical improvisation between performers — arguably the same situation as a spontaneous conversation. Limb says, "Specifically, it's syntactic and not semantic processing that is key to this type of musical communication." In other words, because the patterns of whistled Turkish are the same as those of the spoken language, the hearer understands what is being "said" despite not having actual words to decipher. The patterns of whistling are familiar despite being translated into a new form.
"We are unbelievably lucky that such a language indeed exists," says Dr. Güntürkün. "It is a true experiment of nature."
Short scene of two men whistling across a valley of about 700 meters in Kuşköy, Turkey. After exchanging greetings, the closer man asks the distant one whether he will come to the café later on. The distant man promises to do so. Then Onur Güntürkün asks the closer whistler to transmit his greetings. In response, the distant man greets back and asks when the "teacher" (O.G.) will leave the village. When told (in whistles) that departure is planned for the next day, he wishes him a smooth journey. Finally, the man close to the camera also transmits the best wishes of Osman, a person standing nearby. The distant man greets back, also in the name of his wife Nazmiye, who has joined him on the terrace of his house. Video by Onur Güntürkün.