A new method could increase our understanding of learning, aging and diseases that erode connections in the brain.
By Rick Richardson
Technology This Week
A new study suggests that the brain may be able to hold nearly 10 times more information than previously thought.
As in a computer, the brain’s memory storage is measured in “bits,” and the number of bits it can hold rests on the connections between its neurons, known as synapses. Historically, scientists thought synapses came in only a few sizes and strengths, limiting the brain’s storage capacity. That theory has since been challenged, and the new study further backs the idea that the brain can hold about 10 times more information than previously thought.
In the new study, researchers developed a precise method to assess the strength of connections between neurons in a part of a rat’s brain. These synapses form the basis of learning and memory, as brain cells communicate at these points and thus store and share information.
By better understanding how, and by how much, synapses strengthen and weaken, the scientists more precisely quantified how much information these connections can store. The analysis, published in the journal Neural Computation, demonstrates how the new method could increase our understanding not only of learning but also of aging and diseases that erode connections in the brain.
“These approaches get at the heart of the information processing capacity of neural circuits,” said Jai Yu, an assistant professor of neurophysiology at the University of Chicago who was not involved in the research. “Being able to estimate how much information might be represented is an important step toward understanding the capacity of the brain to perform complex computations,” he added.
In the human brain, there are more than 100 trillion synapses between neurons. Chemical messengers are launched across these synapses, facilitating the transfer of information across the brain. As we learn, the transfer of information through specific synapses increases. This “strengthening” of synapses enables us to retain new information. In general, synapses strengthen or weaken in response to how active their constituent neurons are, a phenomenon called synaptic plasticity.
However, as we age or develop neurological diseases such as Alzheimer’s, our synapses become less active and thus weaken, reducing cognitive performance and our ability to store and retrieve memories.
Scientists can measure the strength of synapses by looking at their physical characteristics. Messages sent by one neuron will sometimes activate a pair of synapses, and scientists can use these pairs to study the precision of synaptic plasticity. In other words, given the same message, does each synapse in the pair strengthen or weaken in the same way?
Measuring the precision of synaptic plasticity, and the amount of information any given synapse can store, has proven difficult in the past. The new study changes that.
To measure synaptic strength and plasticity, the team harnessed information theory, a mathematical framework for understanding how information is transmitted through a system. This approach lets scientists quantify how much information can be transmitted across synapses while also accounting for the “background noise” of the brain.
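The idea of counting bits sent through a noisy channel can be made concrete with a standard textbook example. This is a generic information-theory sketch, not the study’s actual method: for a hypothetical binary symmetric channel that flips each transmitted bit with probability p, the capacity is C = 1 − H(p), where H is the binary entropy. Noise eats directly into how many bits get through per use of the channel.

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), the uncertainty of a coin with bias p
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(noise):
    # Capacity of a binary symmetric channel: C = 1 - H(noise) bits per use
    return 1.0 - binary_entropy(noise)

print(bsc_capacity(0.0))  # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))  # pure noise: 0 bits get through
print(bsc_capacity(0.1))  # 10% flip rate: roughly 0.53 bits per use
```

The same logic, applied to synaptic transmission instead of a bit channel, is what lets the researchers count bits while discounting background noise.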
This transmitted information is measured in bits, such that a synapse with a higher number of bits can store more information than one with fewer bits, said Terrence Sejnowski, co-senior study author and head of the Computational Neurobiology Laboratory at the Salk Institute for Biological Studies. One bit corresponds to a synapse sending transmissions at two strengths, two bits allow for four strengths, and so on.
The team analyzed pairs of synapses from a rat hippocampus, a region of the brain that plays a significant role in learning and memory formation. These synapse pairs were neighbors and activated in response to the same type and amount of brain signals. The team determined that, given the same input, these pairs strengthened or weakened by the same amount, suggesting the brain is highly precise when adjusting a synapse’s strength.
The analysis suggested that synapses in the hippocampus can store between 4.1 and 4.6 bits of information. The researchers had reached a similar conclusion in an earlier study of the rat brain, but they had crunched the data with a less precise method. The new study helps confirm what many neuroscientists now assume: that synapses carry much more than one bit each, said Kevin Fox, a professor of neuroscience at Cardiff University in the UK who was not involved in the research.
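The bits-to-strengths relation described above (b bits distinguish 2^b strength levels) can be checked numerically. This is a rough illustration of the arithmetic, not a calculation from the paper, and the final total-capacity line is a hypothetical back-of-envelope estimate, not a claim from the study:

```python
import math

def levels_from_bits(bits):
    # b bits distinguish 2**b strength levels (1 bit -> 2, 2 bits -> 4)
    return 2 ** bits

def bits_from_levels(levels):
    # Inverse relation: bits = log2(number of distinguishable levels)
    return math.log2(levels)

# The reported 4.1-4.6 bits per synapse implies roughly
# 17 to 24 distinguishable synaptic strengths
print(round(levels_from_bits(4.1)))  # 17
print(round(levels_from_bits(4.6)))  # 24

# Hypothetical back-of-envelope only: 100 trillion synapses at 4.6 bits
# each would total about 4.6e14 bits, on the order of tens of terabytes
total_bytes = 100e12 * 4.6 / 8
print(total_bytes)
```

Going from "a few sizes and strengths" (1–2 bits) to 4.1–4.6 bits per synapse is what underlies the roughly 10-fold increase in estimated capacity.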
The findings are based on a tiny area of the rat hippocampus, so it’s unclear how they would scale to a whole rat or human brain. It would be interesting to determine how this capacity for information storage varies across the brain and between species, Yu said.
In the future, the team’s method could be used to compare the storage capacity of different areas of the brain, Fox said. It could also compare a single brain region in healthy and diseased states.