Researchers at Johns Hopkins have mapped the sound-processing part of the mouse brain in a way that keeps both the proverbial forest and the trees in view. Their imaging technique lets researchers zoom in and out on views of brain activity in the mouse brain, and it enabled the team to watch brain cells light up as mice ‘called’ to one another. The results, which represent a step toward better understanding how our own brains process language, appear in the journal Neuron.
In the past, researchers often studied sound processing in animal brains by poking tiny electrodes into the auditory cortex, the part of the brain that processes sound. They then played tones and observed the responses of nearby neurons, laboriously repeating the process over a gridlike pattern to map where the active neurons were. The neurons seemed to be laid out in neatly organized bands, each responding to a different tone.
More recently, a technique called two-photon microscopy has allowed researchers to zoom in on minute slices of the live mouse brain, observing activity in unprecedented detail. This newer approach has suggested that the well-manicured arrangement of bands might be an illusion. But researchers could lose their way within the zoomed-in views afforded by two-photon microscopy, unable to tell exactly where they were in the brain.
To get the bigger picture, the lab used a mouse genetically engineered to produce a molecule that glows green in the presence of calcium. Since calcium levels rise in neurons when they become active, neurons in the mouse’s auditory cortex glow green when activated by various sounds. The team used a two-photon microscope to peer into the brains of live mice as they listened to sounds, recorded which neurons lit up in response, and pieced together a global map of each mouse’s auditory cortex. With these mice, the team was able both to monitor the activity of individual populations of neurons and to zoom out to see how those populations fit into a larger organizational picture.
With these advances, the research team was able to see the tidy tone bands identified in earlier electrode studies. In addition, the new imaging platform quickly revealed more sophisticated properties of the auditory cortex, particularly as mice listened to the chirps they use to communicate with one another. Understanding how sound representation is organized in the brain is ultimately important for treating hearing deficits. The team hopes that mouse experiments like this one can provide a basis for figuring out how human brains process language and, eventually, how to help people with cochlear implants and similar interventions hear better.
The researchers note that the same approach could also be used to understand other parts of the brain as they react to outside stimuli, such as the visual cortex and the parts of the brain responsible for processing stimuli from limbs.
Source: The Johns Hopkins University