Saturday, September 29, 2007
Computational horsepower: Natural vs artificial systems
An interesting exercise in dimensional analysis:
There are about 10^11 neurons in the brain, with about 10,000 synapses per neuron, yielding approximately 10^15 synapses total. Extracellular electrophysiology would have you believe that the average pyramidal (excitatory) neuron fires at about 10-20 Hz (inhibitory neurons fire even more rapidly), but there is a strong bias towards recording from more active neurons: it's hard to locate silent neurons with an extracellular electrode, since you can only determine that you are near a neuron when it fires. Intracellular electrodes may be biased towards larger cells, since those are probably easier to spear (or clamp onto, depending upon the technique), but almost all in vivo recordings are performed with extracellular electrodes. Arguments based upon metabolic rates suggest that the average firing rate is closer to 1 Hz, though I don't have the actual papers at my fingertips; I'll fish out the references if challenged.
The traditional leaky-integrate-and-fire model of neural activity holds that the depolarization, shunting, and hyperpolarization delivered to synapses throughout the dendritic tree and soma sum linearly in the soma (subject to a low-pass filter), and that the neuron fires an action potential when its membrane potential exceeds some threshold. Polsky, Mel, and Schiller (2004), amongst other recent papers, imply that individual dendrites or dendritic compartments actually constitute separate computational subunits, using nonlinear voltage-gated and NMDA channels to perform a thresholding operation similar to the one normally ascribed to the axon hillock (where action potentials are initiated).
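For concreteness, here is a minimal sketch (Python/NumPy) contrasting the point-neuron summation of the leaky-integrate-and-fire picture with the two-layer view, in which synapses are grouped onto dendritic subunits that each apply their own threshold before the soma sums the results. The sigmoidal nonlinearity, the subunit count, and all the parameters below are illustrative stand-ins, not fits to the actual biophysics:

```python
import numpy as np

def lif_drive(weights, rates):
    """Point-neuron (integrate-and-fire) view: synaptic inputs,
    wherever they land on the dendritic tree, sum linearly at the soma."""
    return np.dot(weights, rates)

def two_layer_drive(weights, rates, n_subunits=100, threshold=1.0, gain=5.0):
    """Two-layer view (after Polsky, Mel & Schiller 2004): synapses are
    grouped onto dendritic subunits, each of which applies its own
    threshold before the soma sums the subunit outputs."""
    groups = np.array_split(weights * rates, n_subunits)
    subunit_input = np.array([g.sum() for g in groups])
    # A sigmoid stands in for the NMDA / voltage-gated nonlinearity.
    return np.sum(1.0 / (1.0 + np.exp(-gain * (subunit_input - threshold))))

rng = np.random.default_rng(0)
w = rng.uniform(0.0, 0.02, size=10_000)  # ~10,000 synapses per neuron
r = rng.exponential(1.0, size=10_000)    # mean input rate ~1 Hz
print(f"linear somatic drive:    {lif_drive(w, r):.2f}")
print(f"two-layer somatic drive: {two_layer_drive(w, r):.2f}")
```

The point of the two-layer view for our purposes is just bookkeeping: it moves the unit of computation from the neuron (one threshold) to the dendritic subunit (about a hundred thresholds), which changes the flop count below by two orders of magnitude.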
Let's be generous and assume that each thresholding operation roughly corresponds to a single floating-point operation. We can also reasonably estimate that there are about 100 such thresholding sites in the dendritic tree, and that they perform thresholding operations (i.e., floating-point operations) at the same rate that their inputs fire. The human brain then runs at approximately 10^11 neurons * 10^2 thresholding sites/neuron * 1 Hz = 10^13 flops. Bounding the estimate from above by assuming that each synapse performs a floating-point operation every time it receives a spike, and that the average neuron fires at 10 Hz, we find that the brain performs at most about 10^15 synapses * 10 Hz = 10^16 flops. The world's fastest supercomputers run at about 100 teraflops, or 10^14 flops, whereas desktop computers achieve about 10^10 flops. Furthermore, Moore's law implies that computer performance should double approximately every 24 months. This suggests that computers are very close to the computational capacity of the brain.
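To make the arithmetic explicit (and to see what the Moore's law doubling rate implies about closing the remaining gap), here is the same back-of-the-envelope calculation as a few lines of Python; every figure is one of the order-of-magnitude estimates from the paragraph above:

```python
import math

# Order-of-magnitude figures taken straight from the text above.
neurons      = 1e11  # neurons in the brain
synapses_per = 1e4   # synapses per neuron (10^15 synapses total)
subunits_per = 1e2   # dendritic thresholding sites per neuron
low_rate_hz  = 1.0   # metabolically constrained mean firing rate
high_rate_hz = 10.0  # extracellular-recording estimate

lower_bound = neurons * subunits_per * low_rate_hz   # = 1e13 flops
upper_bound = neurons * synapses_per * high_rate_hz  # = 1e16 flops
supercomputer = 1e14                                 # ~100 teraflops (2007)

print(f"lower bound: {lower_bound:.0e} flops")
print(f"upper bound: {upper_bound:.0e} flops")

# With one doubling every ~24 months, years until a top supercomputer
# reaches even the generous upper bound:
doublings = math.log2(upper_bound / supercomputer)
print(f"~{2 * doublings:.0f} years to the upper bound")
```

Today's fastest machines already sit above the lower bound, and at one doubling every two years they would reach the upper bound in roughly thirteen years.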
Of course, this doesn't imply that we can simulate such a brain. The biophysics underlying these abstract computations is orders of magnitude more complex than the computations themselves; the Blue Brain Project is currently struggling to simulate a single cortical column of a scant 10,000 neurons. To make efficient use of our available computational power to simulate the high-level activity of the brain, we would first need to know the basic algorithms the brain is computing.