Imagine driving on a dark road. In the distance you see a single light. As the light approaches it splits into two headlights. That’s a car, not a motorcycle, your brain tells you.
A new study found that neural circuits in the brain rapidly multitask between detecting and discriminating sensory input, such as headlights in the distance. That’s different from how electronic circuits work, where each circuit performs one specific task. The brain, the study found, is wired in a way that allows a single pathway to perform multiple tasks.
“We showed that circuits in the brain change or adapt from situations when you need to detect something versus when you need to discriminate fine details,” said Garrett Stanley, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, whose lab performed the research. “One of the things the brain is good at is doing multiple things. Engineers have trouble with that.”
The research findings were published online in the journal Neuron on March 5. The research was funded by the National Institutes of Health (NIH) and the National Science Foundation (NSF).
The distance at which a person can discern two headlights from a single light is controlled by the acuity of the body’s sensory pathway. For decades neuroscientists have assumed that the level of one’s acuity is controlled by the distance between areas in the brain that are triggered by the sensory input. If these two areas of the brain closely overlap, then two sensory inputs – two headlights in the distance – will appear as one, the thinking went. The new study, for the first time, used animal models and optical imaging to directly assess how acuity is controlled in the brain, and how acuity can adapt to the task at hand. One neuronal circuit can do different things and do them in a robust way, the study found.
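The overlap idea above can be sketched numerically. In this minimal toy model (not the study’s actual analysis — the profile widths and the Gaussian form are illustrative assumptions), each of two nearby stimuli activates a Gaussian “hotspot” of cortical activity; the two inputs register as two only if the combined activity dips between the peaks, which for equal Gaussians happens when their separation exceeds twice the profile width:

```python
import numpy as np

def activation_profile(x, d, sigma):
    """Summed cortical activation from two point stimuli separated by d.

    Each stimulus is modeled as a Gaussian hotspot of width sigma
    (an illustrative assumption, not the paper's measured profile).
    """
    return (np.exp(-(x - d / 2) ** 2 / (2 * sigma ** 2))
            + np.exp(-(x + d / 2) ** 2 / (2 * sigma ** 2)))

def is_resolved(d, sigma):
    """True if the two hotspots produce two distinct peaks.

    Two peaks exist iff the combined profile has a dip at the midpoint;
    for equal Gaussians this works out analytically to d > 2 * sigma.
    """
    x = np.linspace(-(d / 2 + 4 * sigma), d / 2 + 4 * sigma, 2001)
    f = activation_profile(x, d, sigma)
    return bool(f[len(x) // 2] < f.max() - 1e-9)
```

With `sigma = 1.0`, a separation of `3.0` is resolved as two inputs while a separation of `1.0` merges into one — the “two headlights look like one light” regime.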
“The general problem that is not well understood is how information about the outside world makes its way into our brain, into these patterns of electrical activity that ultimately let us perceive the outside world,” Stanley said. “This paper squarely goes after that link between what the brain is doing, how it’s activated and what that means for perception.”
Sensory information is encoded in the brain, much as gene sequences in DNA encode physical traits. When the visual pathway detects an object, like a coffee cup, the brain has a corresponding code: a neural representation that transforms that input into sensation.
Researchers had yet to adequately quantify the link between discerning whether an object exists and discriminating finer details about what that object is, Stanley said.
“Surprisingly, we don’t understand neural coding problems very well, either in normal physiology or in disease states,” Stanley said. “I think it’s great to be an engineer that works on this because engineers tend to love and think about very complicated systems.”
To learn about the details of the brain’s acuity, the researchers studied an animal with a high level of acuity – the rat. Rats are nocturnal animals that use their whiskers to sense the outside world. Their whiskers are arranged in rows, and distinct chunks of brain tissue correspond to individual whiskers, much as a human’s body surface is mapped onto the brain surface. When a rat’s whisker touches something, a specific part of the brain becomes activated, just as a specific brain region activates when a person’s finger touches something.
“When we image the brain, we can move a whisker on the side of the face and on the opposite side of the brain there’s a little hotspot that you can image in real time,” Stanley said.
The researchers deflected rats’ whiskers and then used optical imaging technology to observe the areas of the brain that were activated and measured the overlap between those areas. Rats were also trained to perform a specific task depending on which whisker was deflected.
The researchers found that pathways in the brain have the ability to switch between doing different kinds of tasks, such as detecting a sensory input and deciding what to do with that information.
“Same circuit, same cells, but doing something different in two different contexts,” Stanley said.
When engineers want a circuit to do something, they build a circuit specific for that task. When they want a circuit to do something else, they build a different circuit. But in the brain, the study showed, a single pathway adaptively shifts from being good at detecting something to being good at discriminating fine details.
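The “same circuit, two modes” idea can be caricatured in code. In this hedged sketch (the mode names, width values, and Gaussian tuning are illustrative assumptions, not the paper’s model), one function plays the role of the pathway; only a mode flag changes, broadening the spatial tuning to pool input for sensitive detection or narrowing it to separate nearby inputs for discrimination:

```python
import numpy as np

# Hypothetical mode-dependent tuning widths (illustrative values only):
# broad tuning pools input for sensitive detection; narrow tuning
# separates nearby inputs for fine discrimination.
WIDTHS = {"detect": 1.5, "discriminate": 0.6}

def pathway_response(x, stimuli, mode):
    """Response of one pathway whose spatial tuning depends on task mode.

    The 'circuit' (this function) is identical in both modes; only the
    tuning-width parameter adapts, mimicking the study's context switch.
    """
    sigma = WIDTHS[mode]
    return sum(np.exp(-(x - s) ** 2 / (2 * sigma ** 2)) for s in stimuli)

def resolves_two(d, mode):
    """True if two stimuli separated by d yield two distinct peaks."""
    x = np.linspace(-8, 8, 4001)
    r = pathway_response(x, (-d / 2, d / 2), mode)
    return bool(r[len(x) // 2] < r.max() - 1e-9)
```

A separation of `1.6` merges into one blob in detect mode but splits into two peaks in discriminate mode — the same code path, reconfigured by a single parameter, trades sensitivity for acuity.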
“As an engineer, I can’t design a circuit that would do that,” Stanley said. “This is where the brain jumps out and says, ‘I’m better than you are at this.’”
Learning more about how circuits in the brain multitask could lead to a better understanding of disease, to new therapeutic applications, or to ways of improving how the brain functions. Stanley said that down the road engineers might be able to experimentally manipulate brain circuits to perform a desired task.
“Can we make individuals better at doing something? Can we have them detect things more rapidly or discriminate between things with better acuity?” Stanley said. “Using modern techniques, we believe that we can actually influence the circuit and have it selectively grab one kind of information from the outside world versus another.”
This research is supported by the National Institutes of Health (NIH) under award number R01NS48285, and by the National Science Foundation (NSF) Collaborative Research in Computational Neuroscience (CRCNS) program under award number IOS-1131948. Any conclusions or opinions are those of the authors and do not necessarily represent the official views of the sponsoring agencies.
Douglas Ollerenshaw et al., “The adaptive trade-off between detection and discrimination in cortical representations and behavior,” Neuron, March 2014. DOI: 10.1016/j.neuron.2014.01.025.