This is just a link to John Walker’s fourmilab pages.
When considering computation, in nature as well as on digital computers, there’s a tendency to focus on the computing element—the CPU chip in present-day computers, for example. But the magic in computing is really in the storage (or, as it’s now usually called, memory).
Quite true. And there is more to it. All brains are built from computing elements – neurons, which are specialized body cells – that are slow, somewhat unreliable from an EE point of view, and have very limited individual computing power. The real magic occurs not in the processing units but in the complex dendritic wiring and the synaptic junctions. A single cortical column of a rat has about 1E4 neurons, but the number of synapses is roughly the square of that, 1E8. I don't think this is a coincidence: with dense connectivity, n neurons admit on the order of n² pairwise connections. That's a lot of memory for such a basic unit.
We can recognize a familiar face in a few dozen milliseconds (~25 ms, one cycle at 40 Hz). The slow switching speed of neurons rules out any complex, iterative algorithm: there is time for only a handful of sequential neural steps. Each brain area receives its input, does something, and sends out its result as soon as possible. Recognition is, in a very fundamental sense, just memory lookup! Any approach that does not use huge amounts of memory is suspect from the outset. The slow processing units limit the class of useful algorithms, too: it should be some kind of finite state machine, a lexer or parser, operating on stored maps from Input × State ⟶ State.
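The lookup-driven picture above can be made concrete with a minimal sketch in Python (the states and inputs here are my own hypothetical toy example, not anything from the original): all the "computation" is precompiled into a stored Input × State ⟶ State table, and each recognition step is a single memory lookup, with no iteration or arithmetic at runtime.

```python
# Table-driven finite state machine: the intelligence lives entirely in a
# precomputed transition table; each step is one dictionary lookup.
# Toy recognizer for strings of the form "ab", "aab", "aaab", ...
# (Illustrative sketch only -- state names and alphabet are hypothetical.)

# Transition map: (state, input) -> next state.
TRANSITIONS = {
    ("start", "a"): "saw_a",
    ("saw_a", "a"): "saw_a",
    ("saw_a", "b"): "accept",
}

def run(inputs):
    """Feed inputs through the table; any unlisted (state, input) pair rejects."""
    state = "start"
    for symbol in inputs:
        # One lookup per input symbol -- no per-step computation.
        state = TRANSITIONS.get((state, symbol), "reject")
    return state

print(run("aab"))  # -> accept
print(run("ba"))   # -> reject
```

The point of the sketch is the trade: a larger table (more "synapses") buys a trivially simple, fast per-step operation, which is the only kind of operation slow switching elements can afford.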