Brain Cells More Powerful Than You Think

WEDNESDAY Dec. 19, 2007 -- The human brain constantly sorts through its 1 trillion cells, looking for perhaps only one or a handful of neurons to carry out a particular action, a trio of new studies says.

The research, conducted with rodents and published in the Dec. 20 issue of Nature, could rewrite the textbooks on just how important individual brain cells or cell clusters are to the working mind.

Before these insights, "The thinking was that very large ensembles of neurons [brain cells] had to be activated at some point for the animal to feel or perceive" a stimulus, explained the senior researcher of two of the studies, Karel Svoboda, a group leader at the Howard Hughes Medical Institute in Ashburn, Va.

"But it turns out that a remarkably small number -- on the order of 50 or so activated neurons -- is sufficient to drive reliable behaviors," said Svoboda, who is also associated with the Cold Spring Harbor Laboratory, in New York.

Another study, this one conducted by scientists at Humboldt University of Berlin and the Erasmus Medical Center in Rotterdam, the Netherlands, found that stimulating just one of the estimated 100 million neurons in a rat's brain was enough to cause the rodent to act differently.

"The fact that a single cell can influence behavior in the cortex is fascinating," said neuroscientist Paul Sanberg, director of the Center of Excellence for Aging and Brain Repair at the University of South Florida, Tampa. The new findings are "allowing us to answer questions about how the brain controls behavior at the cellular level," added Sanberg, who was not involved in the studies.

In one of the studies, Svoboda and his colleagues genetically engineered a select few brain cells in active mice so that the cells would react to a light stimulus.

Then they exposed a part of each mouse's brain and placed a small light-emitting diode over the area. The experiment "was essentially a trick to stimulate [only] these cells," Svoboda explained.

Finally, they dialed the light down until they found the smallest number of brain cells needed to evoke a measurable response in the mice. That number turned out to be fewer than 50 -- far below the widespread networks of cellular activation that neuroscientists had previously assumed would be necessary, Svoboda said.

The mouse brain's ability to tap into a mere 50 cells is even more remarkable when you consider that the activity of this cluster of cells takes place amid a background roar of other neurological "noise" from millions of cells, he said.

"At the same time, the functional brain area just chatters along and produces perhaps a hundred thousand spontaneous action potentials [electrical signals]," he noted. "So, the brain can actually distinguish the tiny, tiny number of action potentials from that huge background."

According to Svoboda, the experiment strongly supports a theory of brain function called "sparse coding," in which "neurons that listen to the neurons that we have activated have to be able to pull out very sparse subsets of activity."
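The idea behind sparse coding can be pictured with a toy simulation: a handful of reliably activated cells can be read out cleanly even when they are buried in spontaneous background spikes, provided the downstream "listener" weights only that sparse subset. All numbers, names, and the readout rule below are illustrative assumptions for this sketch, not figures from the studies.

```python
import random

random.seed(0)

N_NEURONS = 20_000   # background population (scaled down for speed)
N_STIM = 50          # roughly the number of activated cells in the study
P_SPONT = 0.01       # assumed chance a neuron fires spontaneously per window
TRIALS = 20

stim_cells = set(range(N_STIM))  # pretend the first 50 are the engineered ones

def spike_window(stimulated: bool) -> set[int]:
    """Indices of neurons that fired during one time window."""
    spikes = {i for i in range(N_NEURONS) if random.random() < P_SPONT}
    if stimulated:
        spikes |= stim_cells  # the 50 engineered cells fire reliably
    return spikes

def sparse_readout(spikes: set[int]) -> int:
    """A downstream 'listener' that counts spikes only from the sparse subset."""
    return len(spikes & stim_cells)

stim_scores = [sparse_readout(spike_window(True)) for _ in range(TRIALS)]
noise_scores = [sparse_readout(spike_window(False)) for _ in range(TRIALS)]
# Even with ~200 spontaneous spikes per window, the sparse readout
# cleanly separates stimulated trials from pure background.
print(min(stim_scores), max(noise_scores))
```

In this toy version, a readout that summed over all 20,000 cells would drown the 50 extra spikes in noise; only the listener tuned to the sparse subset pulls the signal out, which is the gist of the "sparse coding" idea Svoboda describes.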

In another study, Svoboda and co-researcher Christopher Harvey, also of the HHMI and Cold Spring Harbor Laboratory, focused on the synapse -- the microscopic junction, spanning a tiny gap, where one neuron passes signals to the next. Messages are passed neuron-to-neuron across the synapse by a complex mechanism of electrochemical signaling.

"Scientists had shown that synapses behave rather independently," Svoboda said, so that long-term electrical activation ("potentiation") of one synapse didn't directly affect a neighboring synapse. Long-term potentiation is, in essence, the key cellular step in how the brain lays down memory.

However, computer models had suggested that activation at one synapse might more subtly strengthen the synapses around it. In their experiments, Svoboda and Harvey found this to be true.

They report that "neighborhoods" of 10 or 20 synapses "influence each other cooperatively," strengthening discrete groups of synapses.

What's more, this type of synaptic teamwork happens within a specific time frame -- about 10 minutes, a fitting window for laying down the kinds of memories that can lead to learning, Svoboda said.
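One way to picture that cooperative window is a toy rule in which a weak input "sticks" only if a neighboring synapse in the same dendritic stretch was strongly potentiated within the last 10 minutes. The rule, the branch names, and the time units here are illustrative assumptions for this sketch, not the paper's model.

```python
WINDOW = 10 * 60  # seconds: the roughly 10-minute cooperative window

def weak_input_potentiates(t, neighborhood, strong_events):
    """A weak input normally fails to trigger long-term potentiation;
    in this toy rule it succeeds only if a synapse in the same
    neighborhood was strongly potentiated within the last WINDOW seconds."""
    return any(nb == neighborhood and 0 <= t - t_strong <= WINDOW
               for nb, t_strong in strong_events)

# Strong potentiation at time 0 on (hypothetical) branch A:
strong_events = [("branch_A", 0.0)]

print(weak_input_potentiates(300, "branch_A", strong_events))   # True: 5 min later, same neighborhood
print(weak_input_potentiates(300, "branch_B", strong_events))   # False: different neighborhood
print(weak_input_potentiates(1200, "branch_A", strong_events))  # False: 20 min later, window closed
```

The point of the sketch is the two conditions acting together: proximity (same neighborhood of 10 to 20 synapses) and recency (within about 10 minutes) both have to hold for the cooperative boost to apply.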

"That's a very behavioral timescale for learning and memory," he said. For example, a mouse can be placed in a chamber, explore it for a few minutes, then be removed -- and it will still retain a working memory of the chamber when it is reintroduced later.

That's probably due to the fact that the mouse's brain formed synaptic clusters (i.e., memory) specific to the new chamber while it was exploring it, Svoboda explained.

"In this way, they can be dissociated [from the stimulus] over several minutes but still lead to learning," he said.

Although many of these experiments were done in mice, the human brain should work similarly, albeit on a much larger scale, Svoboda said. The mouse brain contains about 100 million neurons, while human brains top out at a trillion such cells, he said.

And even though the research looked at healthy brain function, it may have implications for research into aging or diseased brains, as well.

"You need to understand the fundamental mechanisms. Then you can gain better insight into what might go wrong during neurodevelopmental and neurodegenerative disorders," Svoboda said.

Sanberg agreed.

"This work clearly shows us that all cells are important, and we should try to preserve as many brain cells as possible," he said. "But the number is always flexible and, as you can see, even one cell can influence a number of others."

More information

Learn more about the human brain at The Franklin Institute.

Posted: December 2007

