Echolocation in blind people reveals the brain’s adaptive powers

Authored by sciencemag.org and submitted by mvea

The brain has a way of repurposing unused real estate. When a sense like sight is missing, corresponding brain regions can adapt to process new input, including sound or touch. Now, a study of blind people who use echolocation—making clicks with their mouths to judge the location of objects when sound bounces back—reveals a degree of neural repurposing never before documented. The research shows that a brain area normally devoted to the earliest stages of visual processing can use the same organizing principles to interpret echoes as it would to interpret signals from the eye.

In sighted people, messages from the retina are relayed to a region at the back of the brain called the primary visual cortex. We know the layout of this brain region corresponds to the layout of physical space around us: Points that are next to each other in our environment project onto neighboring points on the retina and activate neighboring points in the primary visual cortex. In the new study, researchers wanted to know whether blind echolocators used this same type of spatial mapping in the primary visual cortex to process echoes.

The researchers asked blind and sighted people to listen to recordings of a clicking sound bouncing off an object placed at different locations in a room while they lay in a functional magnetic resonance imaging scanner. The researchers found that expert echolocators—unlike sighted people and blind people who don’t use echolocation—showed activation in the primary visual cortex similar to that of sighted people looking at visual stimuli.

literalstardust on October 6th, 2019 at 00:30 UTC »

Interestingly enough, dolphins (and other species that naturally echolocate) don't do this--their auditory regions light up rather than their visual ones. This implies that when we echolocate, we're translating the received information into a visual language and interpreting it that way, whereas other animals actually interpret the information as sound, but like, in shapes. How wild is that???

EDIT: Totally forgot to source the study I'm referencing! It's linked below--they essentially took a ton of post-mortem scans of dolphins and collected a bunch of data on the connections between different brain areas (DTI specifically, for those who care), then studied the strength of those connections to determine how much the regions "talked" to each other. One of the authors is my neuroscience professor, and we're actually using the same data for research projects in a class I'm in now! Super excited

https://royalsocietypublishing.org/doi/full/10.1098/rspb.2015.1203

donthedog on October 6th, 2019 at 00:22 UTC »

I’m more impressed they put that in one sentence

AcidReignz_ on October 6th, 2019 at 00:06 UTC »

Was everyone here aware blind people could use echolocation?!