Language is more than speaking: How the brain processes sign language  

February 16, 2021
Over 70 million deaf people around the world use one of more than 200 different sign languages as their preferred form of communication. Although sign languages engage similar brain structures as spoken languages, it has been difficult to identify the brain regions that process both forms of language equally. Scientists at the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) have now discovered in a meta-analysis that Broca's area in the left hemisphere of the brain, already known to be the central hub for spoken languages, is also the crucial brain region for sign languages. It is here that the grammar and meaning of language are processed, regardless of whether the language is spoken or signed. This shows that our brain is generally specialized in processing linguistic information; whether that information is spoken or signed seems to be of secondary importance.

The ability to speak is one of the essential characteristics that distinguish humans from other animals. Many people would probably intuitively equate speech with language. However, cognitive science research on sign languages since the 1960s paints a different picture: today it is clear that sign languages are fully autonomous languages with complex organization on several linguistic levels, such as grammar and meaning. Previous studies on the processing of sign language in the human brain had already found some similarities as well as differences between sign languages and spoken languages. Until now, however, it has been difficult to derive a consistent picture of how both forms of language are processed in the brain.

Researchers at the MPI CBS now wanted to know which brain regions are actually involved in the processing of sign language across different studies - and how large the overlap is with brain regions that hearing people use for spoken language processing. In a meta-study recently published in the journal Human Brain Mapping, they pooled data from sign language processing experiments conducted around the world. "A meta-study gives us the opportunity to get an overall picture of the neural basis of sign language. So, for the first time, we were able to statistically and robustly identify the brain regions that were involved in sign language processing across all studies," explains Emiliano Zaccarella, last author of the paper and group leader in the Department of Neuropsychology at the MPI CBS.

The researchers found that Broca's area in the frontal lobe of the left hemisphere, in particular, was involved in the processing of sign language in almost every study evaluated. This brain region has long been known to play a central role in spoken language, where it processes grammar and meaning. In order to better situate their results from the current meta-study, the scientists compared their findings with a database of several thousand brain-imaging studies.

The Leipzig-based researchers were indeed able to confirm that spoken and signed language overlap in Broca's area. They were also able to show the role played by the right frontal lobe, the counterpart of Broca's area on the left side of the brain. It, too, appeared repeatedly in many of the sign language studies evaluated, because it processes non-linguistic aspects of signing, such as spatial information and social information about the conversation partner. This means that the movements of the hands, face and body of which signs consist are, in principle, perceived similarly by deaf and hearing people. In deaf people, however, these movements additionally activate the language network in the left hemisphere of the brain, including Broca's area. Deaf people therefore perceive the gestures as gestures with linguistic content, rather than as pure movement sequences, as hearing people do.

The results demonstrate that Broca's area in the left hemisphere is a central node in the language network of the human brain. Depending on whether people use language in the form of signs, sounds or writing, it works together with other networks. Broca's area thus processes not only spoken and written language, as was previously known, but abstract linguistic information in any form of language. "The brain is therefore specialized in language per se, not in speaking," explains Patrick C. Trettenbrein, first author of the publication and doctoral student at the MPI CBS. In a follow-up study, the research team now aims to find out whether, in deaf people, different parts of Broca's area are specialized in either the meaning or the grammar of sign language, as they are in hearing people.


For deaf people: Language is more than speaking - How the brain processes sign language

https://www.youtube.com/watch?v=vHUxfTF1Ps0
