Both your hearing and your vestibular sense are received through the same organ: the ear, specifically its middle and inner portions. As you might remember from our post about the vestibular system, the vestibular sense is our sense of head position, and it contributes to our sense of balance. Because both senses are received through the same organ, they’re closely tied, and our ReceptorBased® rehabilitation specialists use that connection to help our patients.
Your sense of hearing begins with the eardrum, a membrane that separates the outer ear from the middle ear. The eardrum passes sound vibrations through the small bones of the middle ear to the cochlea in the inner ear, which turns the sound into electrical signals. These signals then travel along the vestibulocochlear nerve, so named because vestibular signals and sound travel along the same nerve.
Now, you might be wondering:
“If the vestibular and auditory signals travel along the same nerve, how do they not get mixed up?”
Well, the nerve delivers its signals to two different clusters of neurons (called nuclei): the vestibular nucleus and the cochlear nucleus. However, the vestibular signals and your hearing do often mix! You’ve likely witnessed this phenomenon at a loud concert, or when someone accidentally turns the volume up too high. Most people’s immediate response to a loud noise is to move their head away from the sound.
Hearing is so closely tied to the vestibular sense that it can affect a person’s posture. Clinical studies have found that a decrease in a person’s hearing function causes them to deviate toward their less functional ear. Losing hearing on your right side, for instance, could cause you to lean or fall toward that side.
ReceptorBased® rehabilitation specialists can use this knowledge to correct poor posture caused by neurological issues. Our specialists can stimulate hearing on the side toward which a client is leaning, prompting them to shift away and helping correct the issue. This is only one of many ways that we use sound to create better brain performance.
Hearing is a special sense: it’s the only sense in which a stimulus is represented on both sides of the brain. Nerve fibers carry auditory signals from each ear to both hemispheres, likely to help the brain create a stereo image of where sound is coming from in 3D space. In other words, the way the brain receives sound signals allows it to localize, or pinpoint, the source of a sound.
Let’s illustrate it:
Imagine you’re walking across the street. You’re looking down at your phone, but a car suddenly honks its horn. You immediately look to your right. But how do you know that’s where the sound came from?
Because your ears sit on opposite sides of your head, the same sound reaches one ear slightly before the other, so each ear processes it at a slightly different time. The brain interprets this tiny delay to determine where the sound came from.
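For readers who like to see the numbers, here is a minimal sketch in Python of the geometry behind that delay. It assumes a head width of roughly 0.2 meters and a speed of sound of about 343 meters per second; the real auditory system uses additional cues as well (such as loudness differences between the ears), so this is an illustration rather than a model of how the brain actually does it.

```python
import math

SPEED_OF_SOUND = 343.0   # meters per second, in air at room temperature
EAR_SEPARATION = 0.2     # meters; a rough, assumed distance between the ears

def interaural_time_difference(angle_degrees: float) -> float:
    """Approximate delay (in seconds) between the two ears for a sound
    arriving from angle_degrees (0 = straight ahead, 90 = directly right)."""
    angle = math.radians(angle_degrees)
    return EAR_SEPARATION * math.sin(angle) / SPEED_OF_SOUND

def estimate_angle(delay_seconds: float) -> float:
    """Invert the formula: estimate the direction (in degrees) of a sound
    from the measured delay between the ears."""
    ratio = delay_seconds * SPEED_OF_SOUND / EAR_SEPARATION
    ratio = max(-1.0, min(1.0, ratio))  # guard against rounding error
    return math.degrees(math.asin(ratio))

# A car horn directly to your right (90 degrees) reaches your right ear
# only about 0.6 milliseconds before your left ear.
delay = interaural_time_difference(90)
print(f"delay: {delay * 1000:.2f} ms, estimated angle: {estimate_angle(delay):.0f} degrees")
```

Even in this simplified picture, a sound coming from directly to one side produces a delay of well under a millisecond, and that tiny difference is enough for the brain to tell left from right.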
In a survival situation, the brain’s ability to respond to where a sound is coming from is vital. Imagine a world where you couldn’t pinpoint where a car horn was coming from, or from what direction an ambulance was approaching. That’s why sound has to be represented on both sides of the brain.
Some research suggests, however, that sound is not processed in a 50/50 split between the two sides of the brain; it appears to be closer to a 60/40 split. For example, about 60% of the processing for the right ear takes place on the left side of the brain, and vice versa.
Why does that matter?
Let’s say our clinicians found that one side of the brain was not as functional as the other. Because most of each ear’s signal crosses over to the opposite hemisphere, our specialists could play sound into the ear opposite the weaker side in order to preferentially stimulate it. That’s only one of the ways we can use the brain’s sense of hearing to address neurological issues.
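Purely to make the arithmetic behind that idea concrete, here is a tiny sketch, again in Python. The 60/40 figures come from the research mentioned above; the function and numbers are illustrative only, not a clinical calculation.

```python
# Toy illustration of the 60/40 crossover described above; not a clinical model.
CROSSOVER = 0.6  # share of each ear's signal processed by the opposite hemisphere

def hemisphere_share(ear: str) -> dict:
    """Rough share of auditory processing each hemisphere receives
    when sound is played into a single ear ('left' or 'right')."""
    opposite = "left" if ear == "right" else "right"
    return {opposite: CROSSOVER, ear: 1 - CROSSOVER}

# Playing sound into the right ear sends relatively more of the signal to the
# left hemisphere, which is why the choice of ear can target a weaker side.
print(hemisphere_share("right"))  # {'left': 0.6, 'right': 0.4}
```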
Not all sound is processed the same. Research has found that different frequencies (pitches), like the notes on a piano, are represented in different areas of the brain. For instance, G, A, and B notes stimulate the left side of the brain more than the right side. The opposite is true for D, E, and F notes. C is represented in both halves equally.
This has exciting applications in ReceptorBased® rehabilitative therapy: we can use different notes to stimulate one area of the brain over another. Sound therapy has other unique applications as well. As you might remember from our post on the temporal lobe, that’s the region where the brain receives hearing, vestibular information, and smell, and it’s also where the brain processes memory and emotion.
Because sound, memory, and emotion are all processed in the same region, ReceptorBased® therapy specialists can use sound therapy to help people with memory issues, cognitive problems, and difficulties with emotional processing. It can even address neurological disorders that affect smell and vestibular input.
There are a few neurological disorders that can affect a person’s sense of hearing. For example, hyperacusis is a condition in which a person is hypersensitive to sound. It’s an issue of the brain rather than the ears: most often, hyperacusis is caused by a problem in how the brain processes auditory information.
Another common disorder is tinnitus, in which a person hears sounds that aren’t actually there; it is often caused by brain injury. This occurs when an overactive temporal lobe processes more information than it is actually receiving.
As you can see, your sense of hearing is a significant part of your ability to survive. For our ReceptorBased® rehabilitative therapists, it is also a powerful tool for strengthening brain function, improving balance, correcting posture, and sharpening cognition and memory.
If you or your patient is dealing with neurological issues like the ones mentioned above and has not been responding to traditional therapies, turn to Plasticity Brain Centers. Our leading neurological center in Orlando uses cutting-edge therapies taught by the Carrick Institute to restore and improve healthy brain performance for clients all over the nation.