Research Discovery

The sensory code: Shape and texture discrimination in the cortex

Published January 25, 2018; updated February 16, 2018

Each blue line shows the spiking of one cell (n = 256) at different positions of the whisker along a tactile grating (10 mm spatial period; shown along the horizontal axis). By Brian Isett, adapted from data in Isett et al., Figure 6H.

One major open question in neuroscience is “How is sensory experience encoded by neurons in the brain?” Despite decades of research, our understanding of sensory encoding is only now beginning to yield answers that apply to real-world sensory experience.

Berkeley Neuroscience labs are addressing this question across multiple scales, animal models, and technical approaches. For example, the Gallant Lab uses fMRI studies to precisely map responses to natural visual scenes (movie clips) and language (podcasts) in the human brain, while the Scott Lab uses genetic and optical tools to identify the neural circuits that link taste sensation and behavior in fruit flies.

In a study published this month in Neuron, the Feldman Lab has taken our understanding of sensory encoding further by looking at how neurons in the brain’s cerebral cortex encode the shape and texture of objects. They found that shape and texture were encoded by the same neurons on different time scales – shape was encoded instantaneously as a brief burst of neuron firing, while texture was encoded by changes in average firing rate over time.

This finding reveals how the brain multiplexes different types of sensory information in the same neuronal firing patterns, providing an important clue to the nature of the neural code and neural computation.
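The idea of one spike train carrying two signals on different time scales can be illustrated with a toy simulation. This is a hypothetical sketch, not the paper's analysis code; all parameters, rates, and function names below are invented for illustration. Shape is read out from brief, precisely timed bursts, while texture is read out from the average firing rate of the very same train:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_spike_train(edge_times, base_rate, dt=0.001, duration=1.0):
    """Toy spike train: Poisson-like background at `base_rate` (Hz), plus a
    deterministic 5 ms burst locked to each grating-edge contact."""
    t = np.arange(0.0, duration, dt)
    p = np.full(t.size, base_rate * dt)     # per-bin background spike probability
    for edge in edge_times:                 # burst locked to an edge (stick-slip analogue)
        p[(t >= edge) & (t < edge + 0.005)] = 1.0
    return t[rng.random(t.size) < p]        # spike times (s)

def decode_texture(spikes, duration=1.0):
    """Texture read-out: average firing rate over the whole trial (Hz)."""
    return spikes.size / duration

def decode_shape(spikes, duration=1.0, bin_s=0.01, thresh=3):
    """Shape read-out: start times of bins whose spike count crosses `thresh`,
    i.e. the brief bursts that outline the grating edges."""
    counts, bins = np.histogram(spikes, bins=np.arange(0.0, duration + 1e-9, bin_s))
    return bins[:-1][counts >= thresh]

# Two trials with the same grating edges but different background (texture) rates.
edges = [0.1, 0.4, 0.7]
smooth = simulate_spike_train(edges, base_rate=10)
rough = simulate_spike_train(edges, base_rate=40)
```

In this sketch, `decode_shape` recovers the same edge positions from both trials, while `decode_texture` separates them by mean rate – a crude analogue of the burst-timing versus firing-rate multiplexing described above.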

First author and PhD Program alum (2010-2017) Brian Isett helps us understand the importance of this research discovery.

Georgeann Sack: What do you think was the most interesting outcome of this study?

Brian Isett: Many people know that rodents have long whiskers on their snouts, but what they actually do with those whiskers is not well understood. We were curious whether whiskers could capture tactile patterns, like shape, during active exploration. For me, the most interesting finding was that we could see the shape of the tactile patterns presented to the mouse by looking at where neurons in somatosensory cortex responded. Neural responses locked to the leading edges of tactile gratings, resulting in a loose outline of these shapes. Understanding how such spatially precise inputs are used by the brain could provide an important window into how brains generally build accurate pictures of the world.

The experiment: Multi-site linear silicon probes were placed in somatosensory cortex to record neuron firing as the mouse actively explored its surroundings with its whiskers. The mouse runs on a 1D virtual track that responds to its locomotion, delivering tactile shapes and textures naturalistically in a controlled environment. Video is slowed down 10x. By Brian Isett, originally published on his blog, Body Electric.

GS: What previous research inspired your own?

BI: I was inspired by classic research showing that neurons in somatosensory cortex code precise tactile shapes in primates (Phillips et al. 1988) and by work done by previous graduate students in Dan Feldman’s lab, Jason Wolfe and Shantanu Jadhav (Nature Neuroscience 2009). Jason and Shantanu identified fast whisker vibrations called “stick-slips” that occurred when rats whisked on sandpaper, and this led them to hypothesize that these events could be a powerful signal for coding texture. We ultimately combined and advanced several aspects of these studies to see if stick-slips could describe a similar code for shape in the mouse brain.

GS: Are there any open questions from your research that you would like someone with a different expertise to address?

BI: If we think stick-slips are important for encoding tactile shape and texture, it would be very cool to inhibit somatosensory cortex right after stick-slip events to see if this disrupts the mouse's perception during behavior. Someone with experience in optogenetic control of neurons might be able to perform that experiment. I would also love to see if the spatial information present in the tactile input plays a role in building broader spatial maps, for example in the hippocampus, where neurons show strong spatial receptive fields. Someone with expertise in hippocampal recordings could examine how these whisker inputs are integrated with other spatial codes available in the brain.

by Georgeann Sack

Additional Information