“The brain’s memory capacity is in the petabyte range, as much as the entire Web, data from the Salk Institute show; may lead to more energy-efficient computers…”
“Salk researchers and collaborators have achieved critical insight into the size of neural connections, putting the memory capacity of the brain far higher than common estimates. The new work also answers a longstanding question as to how the brain is so energy efficient, and could help engineers build computers that are incredibly powerful but also conserve energy.
‘This is a real bombshell in the field of neuroscience,’ says Terry Sejnowski, Salk professor and co-senior author of the paper, which was published in eLife. ‘We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte (1 quadrillion, or 10¹⁵, bytes), in the same ballpark as the World Wide Web.’
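The unit arithmetic behind the quoted figure can be checked directly. The sketch below only restates numbers given in the article (a petabyte as 10¹⁵ bytes, and a factor-of-10 increase over prior conservative estimates); the implied prior baseline of roughly 100 terabytes is derived from those two figures, not stated in the source:

```python
# Numbers as quoted: new estimate is "at least a petabyte" (10**15 bytes),
# a factor of 10 above earlier conservative estimates.
PETABYTE = 10**15  # bytes

new_estimate = PETABYTE
factor = 10

# Implied prior conservative estimate (derived, not stated in the article):
implied_prior = new_estimate // factor

print(new_estimate)   # 1000000000000000 bytes (1 petabyte)
print(implied_prior)  # 100000000000000 bytes (~100 terabytes)
```

This is just a consistency check on the stated units, not a model of how the capacity estimate itself was measured.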
‘When we first reconstructed every dendrite, axon, glial process, and synapse from a volume of hippocampus the size of a single red blood cell, we were somewhat bewildered by the complexity and diversity amongst the synapses,’ says Kristen Harris, co-senior author of the work and professor of neuroscience at the University of Texas, Austin. ‘While I had hoped to learn fundamental principles about how the brain is organized from these detailed reconstructions, I have been truly amazed at the precision obtained in the analyses of this report.’…”