A researcher at UC San Diego is methodically slicing and preparing one of the world's most famous brains (where "world" is narrowly defined as the world of cognitive science -- I suspect H.M. isn't a household name, though he was a huge anonymous celebrity among psychologists and neuroscientists for the last half-century). The hope is to make the data electronically available.
An interesting fact tucked in at the end of the article: making it possible to zoom down to the cellular level on each electronic copy of a slide will require 1-10 terabytes per slide. That's terabytes with a t. There are 2,400 slices of the brain (I'm not clear on whether all will become slides, but presumably a good fraction will). And the researcher eventually wants to expand the project to 500 brains.
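Those numbers compound quickly. Here's a quick back-of-the-envelope sketch (assuming, hypothetically, that all 2,400 slices are digitized at the stated 1-10 terabytes each):

```python
# Back-of-envelope storage estimate for the brain-digitization project.
# Assumption (not from the article): every one of the 2,400 slices
# becomes a slide, each at the stated 1-10 TB.
slices = 2400
tb_low, tb_high = 1, 10  # terabytes per slide

per_brain_low = slices * tb_low    # 2,400 TB, i.e. ~2.4 petabytes
per_brain_high = slices * tb_high  # 24,000 TB, i.e. ~24 petabytes

# The researcher's eventual goal: 500 brains.
brains = 500
total_low = per_brain_low * brains    # 1,200,000 TB, ~1.2 exabytes
total_high = per_brain_high * brains  # 12,000,000 TB, ~12 exabytes

print(per_brain_low, per_brain_high)  # 2400 24000
print(total_low, total_high)          # 1200000 12000000
```

So one fully digitized brain lands somewhere in the petabyte range, and 500 of them push into exabytes -- hence the storage issue below.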
This creates a serious storage issue.
It also brings up the question of building synthetic brains. If we need something on the order of 5,000 terabytes just to render a digital image of a brain (2,400 slices at roughly a couple of terabytes apiece), how many terabytes do we need to perfectly model one?
Granted, there aren't 5,000 trillion neurons in a brain (there are about 100 billion). But one neuron doesn't equal one bit -- a neuron's behavior is complex, and its pieces do things. Since we don't fully understand what neurons do, I take it as uncontroversial that we don't know how many 'bits' make up a neuron.
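To make the comparison concrete, divide the image data by the neuron count (a rough ratio, not a claim about how much information a neuron actually holds):

```python
# Rough image-data-per-neuron ratio, using the figures in the post.
# This measures slide imagery per neuron, NOT the bits needed to
# model a neuron -- that number is the unknown the post is about.
image_bytes = 5_000 * 10**12  # ~5,000 terabytes of slide imagery
neurons = 100 * 10**9         # ~100 billion neurons in a human brain

bytes_per_neuron = image_bytes / neurons
print(bytes_per_neuron)  # 50000.0 -> about 50 KB of imagery per neuron
```

Fifty kilobytes of pictures per neuron -- and we still can't say whether a faithful functional model of that neuron needs more data than that or less.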
This is one of the complications with predicting the future of neuroscience. Our knowledge and technology are growing exponentially, but we don't know how far there is to go.