Memory capacity of brain 10 times greater than previously thought

  • New measurements have exploded the previous estimates of the human brain's memory capacity, and also help explain how neurons have such computational power when their energy use is so low.

The question of the brain's capacity usually brings up the observation that the human brain contains about 100 billion neurons. If each one has, say, 1,000 or more connections to other neurons, that gives some 100 trillion connections in which our memories can be held. These connection points are synapses, which change in strength and size when activated, and these changes are a critical part of the memory code. In fact, synaptic strength is analogous to the 1s and 0s that computers use to encode information.
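
To make that arithmetic explicit, here is a minimal sketch; the counts are the usual round approximations, not measurements from this study:

```python
# Back-of-the-envelope arithmetic behind the "100 trillion connections" figure.
# These counts are rough approximations, not measurements from the new study.
neurons = 100e9               # ~100 billion neurons in the human brain
synapses_per_neuron = 1_000   # ~1,000 connections per neuron (a low-end figure)

total_connections = neurons * synapses_per_neuron
print(f"Approximate number of synaptic connections: {total_connections:.0e}")  # ~1e14
```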

But here's the thing: unlike the binary code of computers, synapses have more than two sizes available to them. With the relatively imprecise tools previously available, researchers had distinguished three sizes: small, medium and large. They had also calculated that the difference between the smallest and the largest was a factor of 60.

Here is where the new work comes in: new techniques have enabled researchers to see that synapses have far more options open to them. Synapses can, it seems, differ in size by as little as 8%, giving them about 26 distinguishable sizes, which corresponds to storing 4.7 bits of information at each synapse, as opposed to one or two.
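
The 4.7-bit figure follows from treating each distinguishable synapse size as one symbol in a code: a synapse with N distinguishable sizes can store log2(N) bits, so 26 sizes give log2(26) ≈ 4.7 bits. A minimal sketch of that calculation (the function name is mine, not the researchers'):

```python
import math

# Information per synapse if synaptic strength can take n_states distinguishable values.
# The 26-state / 4.7-bit figures come from the article; 3 states reflects the
# earlier small/medium/large scheme.
def bits_per_synapse(n_states: int) -> float:
    return math.log2(n_states)

print(f"3 states  -> {bits_per_synapse(3):.1f} bits")   # ~1.6 bits (old scheme)
print(f"26 states -> {bits_per_synapse(26):.1f} bits")  # ~4.7 bits (new estimate)
```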

Despite the precision that this 8% figure implies, hippocampal synapses are notoriously unreliable, with a signal typically activating the next neuron only 10-20% of the time. But this seeming unreliability is a feature, not a bug. It means a single spike isn't enough to do the job; what's needed is a stable change in synaptic strength, which comes from repeated and averaged inputs. Synapses are constantly adjusting, averaging out their success and failure rates over time.

The researchers calculate that, for the smallest synapses, about 1,500 signaling events (roughly 20 minutes' worth) are needed to produce a change in size and strength, while for the largest synapses only a couple of hundred signaling events (1 to 2 minutes) are enough. In other words, every 2 to 20 minutes your synapses are stepping up or down to the next size, in response to the signals they're receiving.
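
To see why averaging over many events makes an unreliable synapse precise, here is a toy simulation; it is my own sketch of the averaging idea, not the researchers' model, and assumes a transmission probability of 15% purely for illustration:

```python
import random

# Toy illustration: a synapse that transmits only ~15% of the time gives a very
# noisy estimate of its strength after a few events, but a stable one after the
# hundreds to thousands of events described in the article.
def estimated_strength(true_release_prob: float, n_events: int) -> float:
    successes = sum(random.random() < true_release_prob for _ in range(n_events))
    return successes / n_events

random.seed(1)
for n in (10, 200, 1500):
    samples = ", ".join(f"{estimated_strength(0.15, n):.2f}" for _ in range(5))
    print(f"{n:5d} events -> estimates: {samples}")
# With a handful of events the estimates scatter widely; with hundreds to
# thousands they converge on the true 0.15, matching the idea that stable size
# changes emerge from inputs averaged over minutes, not from single spikes.
```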

Based on this new information, researchers now estimate that the brain can hold at least a petabyte of information, about as much as the entire World Wide Web currently holds. This is ten times more than previously estimated.

At the moment, only hippocampal neurons have been investigated. More work is needed to determine whether the same is true across the brain.

In the meantime, the work has given us a better notion of how memories are encoded in the brain, raised the estimate of the human brain's potential capacity, and offered a new way of thinking about information networks that may enable engineers to build better, more energy-efficient computers.

http://www.eurekalert.org/pub_releases/2016-01/si-mco012016.php

http://www.scientificamerican.com/article/new-estimate-boosts-the-human-brain-s-memory-capacity-10-fold/

Full text at http://elifesciences.org/content/4/e10778v2

Reference: Bartol TM, Bromer C, Kinney J, Chirillo MA, Bourne JN, Harris KM, Sejnowski TJ. 2015. Nanoconnectomic upper bound on the variability of synaptic plasticity. eLife 4:e10778. doi:10.7554/eLife.10778
