Attention in waves
How well we can listen to another person's speech in noisy surroundings depends on how well our alpha brain waves adapt to the rhythm of that speech.
It is often hard for us to listen carefully, especially with background noise and other conversations going on around us. Scientists from the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) in Leipzig and the University of Lübeck have now discovered what determines how well we are able to listen: the more closely the so-called alpha waves in our brain oscillate to the rhythm of the speech we want to follow, the better we can concentrate on what the other person is saying.
An announcement on the platform, the screeching of the approaching train, snippets of conversation: in most everyday situations we are exposed to several sounds at once, of which only a few are important to us. When we pay attention to just one of these sound sources, we can extract the important information and block out the rest.
The alpha waves in our brain, which reflect our listening effort, are closely linked to such attentional processes. Neuroscientists from the Auditory Cognition research group at MPI CBS and the University of Lübeck examined how the modulation of these waves enables us to listen despite distraction.
“The amplitude of the alpha waves increases in one hemisphere of the brain or the other, depending on which ear the listener's attention is focused on”, says study leader Malte Wöstmann. “The difference between the alpha-wave amplitudes of the right and the left hemisphere tells us whether the listener is directing attention to the right or to the left.”
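To make this hemispheric comparison concrete, the sketch below shows one hypothetical way such a lateralization measure could be computed in Python. The sensor signals, sampling rate, alpha band, and the index formula are illustrative assumptions, not the researchers' actual analysis pipeline.

    import numpy as np
    from scipy.signal import welch

    FS = 1000  # assumed sampling rate in Hz

    def alpha_power(signal, fs=FS, band=(8.0, 12.0)):
        # Average spectral power of the signal within the alpha band (8-12 Hz).
        freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        return psd[in_band].mean()

    def alpha_lateralization(left_sensor, right_sensor):
        # Positive values: more alpha power over the right hemisphere;
        # negative values: more alpha power over the left hemisphere.
        p_left, p_right = alpha_power(left_sensor), alpha_power(right_sensor)
        return (p_right - p_left) / (p_right + p_left)

    # Synthetic example: a 10 Hz oscillation that is stronger over the right hemisphere.
    t = np.arange(0, 5, 1 / FS)
    rng = np.random.default_rng(0)
    left = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
    right = 2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
    print(f"lateralization index: {alpha_lateralization(left, right):+.2f}")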
The difference in alpha activity between the two hemispheres of the brain does not, however, remain constant while we are exposed to important and unimportant information at the same time. Instead, the alpha activity fluctuates with the rhythm of the speech we want to listen to. Each spoken word first triggers the expected response in the auditory cortex, followed about half a second later by a pronounced difference between the alpha activity of the left and the right hemisphere. “In noisy environments our brain oscillates between two states: processing the acoustic information on the one hand and selective attention on the other”, explains Jonas Obleser, head of the Auditory Cognition research group.
Interestingly, these oscillations of the alpha waves occur not only in the parietal lobe, an area well known for its role in attention, but also in the auditory cortex, which processes the acoustic stimuli.
But what does this rhythmic oscillation of alpha activity mean for hearing? To answer this question, the scientists exposed participants to a difficult listening situation: via headphones, they played different spoken numbers to the left and the right ear at the same time. At the beginning of each trial, a beep in one ear signalled to which side the participant should direct their attention. It turned out that the higher the amplitude of the alpha activity, the better a person's listening ability: they could repeat more of the numbers correctly.
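As a purely illustrative sketch of how such an amplitude-performance relationship could be quantified, one could correlate a per-trial alpha-amplitude estimate with the number of digits repeated correctly. The data below are simulated under an assumed positive relationship; they are not the study's results, and all variable names are placeholders.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    n_trials = 60

    # Hypothetical per-trial measures; the real study used MEG recordings and spoken numbers.
    alpha_amplitude = rng.normal(1.0, 0.2, n_trials)  # arbitrary units
    # Simulate recall that tends to improve with alpha amplitude (illustrative assumption).
    digits_correct = np.clip(
        np.round(2 + 4 * alpha_amplitude + rng.normal(0, 0.8, n_trials)), 0, 8
    )

    r, p = pearsonr(alpha_amplitude, digits_correct)
    print(f"alpha amplitude vs. digits recalled: r = {r:.2f}, p = {p:.3f}")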
These relationships were examined using magnetoencephalography, a technique that makes brain waves, including the characteristic alpha waves, visible by measuring the tiny magnetic fields generated by the brain's electrical activity.
“Now we would like to investigate how these brain processes behave in middle-aged and older people, in whom, as is well known, listening problems start to emerge”, says Obleser, describing a future study planned by his research group.