Monday, November 3, 2008

Are Brains Digital or Analog?

Last year Chris Chatham wrote up a great post entitled 10 Important Differences Between Brains and Computers. There's a wealth of topics to discuss in reference to the post, but I want to focus on what he lists as the #1 difference:
Brains are analogue; computers are digital
It's easy to think that neurons are essentially binary, given that they fire an action potential if they reach a certain threshold, and otherwise do not fire. This superficial similarity to digital "1's and 0's" belies a wide variety of continuous and non-linear processes that directly influence neuronal processing.

For example, one of the primary mechanisms of information transmission appears to be the rate at which neurons fire - an essentially continuous variable.
There's more, but this is the essential part. What's interesting is that Chris points out that one way in which information is conveyed in the brain is by the rate of fire of neurons. But then he ignores the fact that we know that information is carried by the timing of individual spikes, and he categorically labels the brain as "analogue."

A nice metaphor for a neuron is a leaky bucket. When a neuron is receiving incoming activity from other neurons, you can think of that as water trickling into the bucket. This is analogous to a charge building up on the cell membrane. But the membrane has resistance, so it is "leaky", which in the bucket analogy means there is also a tiny hole in the bottom of the bucket. If the water you're putting in doesn't exceed the leakage, the level of water will never rise. Once the water reaches a particular level, a threshold, you can think of the bucket tipping over and sending all its water to the other buckets it connects to. This is analogous to the firing of a neuron. The bucket then resets until it is filled back up to its threshold level.
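The bucket analogy maps directly onto what's called a leaky integrate-and-fire model. Here's a minimal sketch in Python; the leak rate, threshold, and input values are illustrative numbers chosen for the example, not biologically calibrated:

```python
# Minimal leaky integrate-and-fire sketch of the "leaky bucket" analogy.
# All constants here are illustrative, not fit to real neurons.

def simulate_lif(inputs, leak=0.1, threshold=1.0, dt=1.0):
    """Integrate input with a leak; emit a spike (1) when threshold is reached."""
    v = 0.0              # membrane potential ("water level" in the bucket)
    spikes = []
    for i in inputs:
        v += dt * (i - leak * v)   # water trickles in, minus the leak
        if v >= threshold:         # the bucket tips over
            spikes.append(1)
            v = 0.0                # reset, and start filling again
        else:
            spikes.append(0)
    return spikes

# A steady drip above the leak rate eventually drives spikes;
# a weak drip leaks away and never reaches threshold.
strong = simulate_lif([0.3] * 20)
weak = simulate_lif([0.01] * 20)
```

Notice the analog/digital split right in the code: `v` is a continuous quantity, but the output is an all-or-nothing spike.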

So before it reaches threshold, a neuron functions in an analog fashion. When it reaches threshold, it generates an action potential, or spike, which is a binary signal. But then, as Chris points out, if we count the number of spikes within a given time frame, that rate of fire can be measured in an analog fashion.

In my own work, I used to use artificial neurons that modeled only the average rate of fire of neurons. These are known as rate-coding neuron models, and a very common function that approximates the firing rate is the sigmoid:
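A rate-coding unit replaces the whole spike train with a single continuous number, the firing rate, usually squashed through a sigmoid. A minimal sketch (the input rates and weights below are made up for illustration):

```python
import math

def sigmoid(x):
    """Squash net input into a firing rate between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def rate_unit(input_rates, weights, bias=0.0):
    """A rate-coding neuron: weighted sum of input rates -> one output rate."""
    net = sum(r * w for r, w in zip(input_rates, weights)) + bias
    return sigmoid(net)

# The output is a single continuous rate, not individual spikes --
# any information in spike timing has been averaged away.
rate = rate_unit([0.2, 0.9], [1.5, -0.5])
```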

But if you use such a model, you're assuming that no information is being carried by the timing of individual spikes, because you're averaging that information away.

But we know of particular examples in which information is conveyed by individual spikes. One very famous and interesting example is the auditory system of the barn owl. See "Hebbian learning of pulse timing in the barn owl auditory system" by Wulfram Gerstner, Richard Kempter, J. Leo Van Hemmen, Hermann Wagner for a great overview.


Basically, when a mouse makes a sound, the sound waves reach each ear of the barn owl at different times, because the ears are spaced apart. The owl's auditory system is able to determine where the sound came from by comparing the relative timing of the sound reaching each ear, and this information is learned and conveyed via the timing of individual spikes. By the way, the image isn't of a barn owl shooting a mouse with laser vision (though that would be cool). It's meant to show how the sound of the mouse's squeak reaches each of the barn owl's ears at slightly different times.
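The cue the owl is exploiting is the interaural time difference (ITD). Under the usual far-field approximation, the delay between the ears is d·sin(θ)/c for a source at azimuth θ. A quick sanity check in Python; the ear spacing used here is a rough, made-up figure, not a measured owl head:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
EAR_SPACING = 0.05       # meters between the ears -- a rough, illustrative value

def itd(azimuth_deg):
    """Interaural time difference (seconds) for a far-field sound source,
    using the simple delay = d * sin(theta) / c approximation."""
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# Straight ahead, the sound arrives at both ears simultaneously;
# fully off to one side, the difference is on the order of 100 microseconds --
# far finer than the duration of a single spike.
dt_front = itd(0)
dt_side = itd(90)
```

The striking part is that a delay this small gets encoded reliably in spike timing at all, which is exactly why rate-only models can't capture this system.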

We also know that an important aspect of learning throughout the brain involves the relative timing of individual spikes. If a neuron (A) fires just before a downstream neuron (B) it connects to, then the synapse will be "strengthened", or modified in such a way that the next time neuron A fires, it will be more likely to cause neuron B to fire:

However, if the order of firing is reversed, then the synapse is "weakened":

The strengthening and weakening of synapses in this way is known as spike-timing dependent plasticity, or STDP. While there are a number of other ways in which synapses are modified in the brain, these particular mechanisms are thought to underlie many important aspects of learning, and they should not be ignored.
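A common way to write the STDP window is an exponential in the spike-time difference: the weight change is positive when the presynaptic spike leads and negative when it lags, decaying as the spikes get farther apart. A minimal sketch, with illustrative constants rather than fitted ones:

```python
import math

def stdp_delta_w(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Weight change for one spike pair, where dt_ms = t_post - t_pre.
    Pre-before-post (dt > 0) strengthens the synapse; post-before-pre
    (dt < 0) weakens it. The amplitudes and time constant are illustrative."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)
    elif dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0

# A pre-spike 10 ms before the post-spike strengthens the synapse;
# the reversed order weakens it.
ltp = stdp_delta_w(10.0)
ltd = stdp_delta_w(-10.0)
```

Note that the rule only makes sense if individual spike times exist in the model at all, which is the point: a pure rate-coding model has nothing for STDP to operate on.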

So, the answer to the question "Are Brains Digital or Analog?" is a perhaps unsatisfying "both". Some of the ways in which neurons communicate and are modified by learning rely purely on all-or-nothing signals, and are digital in that sense. In other cases, information is conveyed by the rate of fire of neurons, in an analog manner.

But then, computers emulate analog functions as well, so they are neither distinctly digital nor analog. In fact, the dichotomy turns out not to be all that sharp in many domains. What's important is to know in what ways the brain is analog and in what ways it is digital. Both will likely figure into any coherent explanation of how the brain works.

For my own part, I've begun working with spiking neuron models, specifically what are known as leaky integrate-and-fire models. I've become increasingly convinced about the importance of the role of time in understanding cognitive processes, and spiking models allow for communicating information both by the timing of individual spikes and by their rate of fire, while rate-coding models only allow for communication via average firing rates. That's not to say that rate-coding models don't have a lot to teach us about certain aspects of cognition, just that they are limited in their ability to do so.
