Tuesday, August 14, 2012
We have an example of an intelligence made out of matter, our own brain.
Whatever human-level intelligence is, it can be done on hardware of that complexity.
So it's worth asking what the complexity of that hardware is.
We think that whatever it is the brain actually does, the important level for understanding it is the neuron/synapse level.
Neurons look like logic gates with thousands of input legs and one output. Synapses are the connections between them.
The gates we build out of transistors have only two inputs. But we can chain them together to make a structure with a thousand inputs and one output. We need a thousand gates, and so on the order of a thousand transistors, to do that.
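That chaining argument can be sketched abstractly: combining n inputs pairwise always takes exactly n − 1 two-input operations, whatever the operation is, so a thousand-input, one-output unit costs about a thousand of them.

```python
# Sketch of the fan-in argument: reducing n inputs with 2-input
# operations takes exactly n - 1 of them, because each operation
# merges two values into one, and we must go from n values to 1.

def count_two_input_ops(n_inputs: int) -> int:
    return n_inputs - 1

print(count_two_input_ops(1000))  # 999: ~a thousand units per neuron-like element
```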
So it could be that to make a working model of a brain, which implements the same algorithm, we'd need as many transistors as the human brain has synapses.
It's estimated that there are 100 000 000 000 000 synapses in a brain! (10^14)
Our largest commercial processors have something like 1 000 000 000 transistors (10^9).
So at first glance, it looks like we're five orders of magnitude short. That's a long way. It's the difference between a 1950s computer and a year 2000 computer. And we've got absolutely no way to predict whether the development of computers will continue at the astonishing rate that it's been going at over the last half of the twentieth century.
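The gap is just the ratio of the two rough estimates above; as a sanity check:

```python
import math

synapses = 10**14      # rough estimate for a human brain
transistors = 10**9    # a large 2012-era commercial processor

gap = math.log10(synapses / transistors)
print(gap)  # 5.0: five orders of magnitude short on raw component count
```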
So most people who worry about the Singularity think that it's waiting on the hardware, and that the necessary hardware may never come.
But consider. The rate at which neurons can fire is something like 1000Hz. The speed of transistors in current chips is more like 3 GHz. That's six orders of magnitude.
It's always been our experience that making a computer ten times faster buys more capability than making it ten times larger.
So I reckon that if we could work out what it is the brain is actually doing, then we could, on the sort of cheap hardware that lives in your desktop computer, make something that could do what the brain does, but at 10 times the speed.
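Putting the two gaps together (all numbers are the rough estimates above, not measurements): the per-component speed advantage divided by the component-count deficit lands within a factor of a few of that ten-times figure.

```python
neuron_rate_hz = 1_000               # ~1 kHz neuron firing rate
transistor_rate_hz = 3_000_000_000   # ~3 GHz clock

speed_advantage = transistor_rate_hz / neuron_rate_hz   # ~3e6
size_deficit = 10**14 / 10**9                           # ~1e5 (synapses vs transistors)

net = speed_advantage / size_deficit
print(net)  # 30.0: the same ballpark as "10 times the speed"
```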
And all my estimates above are highly conservative. Individual transistors can run much faster than GHz speeds. The speed limit for processor chips is to do with heat output and synchronization of the clock pulses across the entire chip. A neuron with 1000 inputs and one output probably can't do as much information processing as 1000 transistors.
And Moore's Law is currently continuing steadily. And whatever the brain does is highly parallel, and nothing stops us linking processors together to get more transistors in total. So if we knew how the brain worked, and had a proper research budget to build a good one, we should be able to make a brain that ran at least 1000 times faster than our own brains do.
But a lot of what our brains do is stuff like speech recognition, or vision: computationally very hard, but not what we think of when we think of 'intelligence'. Whenever humans try to think about engineering problems, we run into having to do mathematics, or to memorize complex things. We are unbelievably slow at that sort of thing, even though it's actually computationally very simple.
At calculation, memory, symbolic manipulation, almost all the things that human intellectuals have traditionally found most difficult and most impressive, computer brains just blow us away by factors of billions.
These things are not natural for us. They take a great deal of training and practice. Consider the difference between a human child learning her own language and a human child learning the calculus.
Or your own ability to read emotions on strangers' faces versus your ability to solve simple probability questions.
This says to me that in order to do those things, we're in fact using mechanisms that were designed (by evolution) to do something else, and our schooling is a process of carefully repurposing and perverting these mechanisms to do new things, but to do them really badly and slowly.
A real computer brain could probably outspeed us by a factor of 1000 at the things that we're really good at, and that the great majority of humans can do very well, like catching cricket balls, or walking on two legs over rough ground, or imagining what is going on in other human brains, or finding our way home along a route after we've just walked a long way without a map, or recognising where would be a nice place to live, or throwing rocks, or throwing punches.
But at the things that we've traditionally thought of as our highest intellectual achievements, the things that very few humans can do well, which look more like accidental consequences of whatever our brains are actually for, things like mathematics and planning and logic and probability and scientific discovery, I would imagine that the speed factor would be more like several billion.
Luckily, we have absolutely no idea how the brain works, or what computations it performs.
If we knew how the brain worked, and what information processing it performed to work its miracles, and someone were fool enough to build a computer to do those computations, then they would have created a being which could recapitulate the whole of human scientific thought from Aristotle to Einstein in a few minutes.
That wouldn't really be something I'd like to play chess with. I've no idea how it would play. But I'm damned sure it would win.
Posted by John Lawrence Aspden at 4:07 PM