Are brains analog, or digital?

Hell breaks loose after Cornell claim


We can't turn off Analog

Did this make sense to neuroscientist Dr Bill Softky? Softky studied under Carver Mead at Caltech and recently worked at the Redwood Neuroscience Institute. He also knows a little about how computers work, and was awarded a prestigious Microsoft prize for his work on the Intrinsia debugger - even though he's never worked for Microsoft.

"Because there is no accepted answer for how the brain works, people can say anything. The threshold for disproving something is higher than the threshold for saying it, which is a recipe for the accumulation of bullshit," he says.

Go on, Bill, tell us what you really think.

As people throw the computer metaphor around, they appear to be talking about quite different ideas of what a computer really is, and the conversations seem to fly right past each other. Softky appears to be on safer empirical ground with what we might more usefully call a "circuitry" metaphor.

"We know that with brain neurons, at least 90 per cent of the bandwidth they use is digital. There is a fibre or there is not a fibre, there is a pulse or there is not a pulse; there is a ground truth to what we see," he explains.

"People see analog signals sloshing around the brain in MRI scans all the time, but much of that is from instrumental blur; the individual circuit elements and computations are still digital."

The two views then, would appear to be irreconcilable.

Or maybe not. Dr Softky says he agrees with much of what Professor Spivey values too, as expressed in a Slashdot post, where Spivey defends the value of probabilistic algorithms and neural networks, "(programmed on digital computers)", as useful and informative ways to build simulations of various human mental processes.

So is saying that the brain works like an analog computer a better fit? Analog computing is still capable of extraordinary computation quite beyond the capabilities of a Turing machine, or a digital computer. You can see its appeal.

Today's digital computers are fast and cheap, easily produced, and ubiquitous, but that's the consequence of politics. And so analog computing is economically and philosophically unacceptable now that we live in the "digital era", or "information society", or however it's branded this week. Such arguments neglect the fact that today's "economics" is entirely an agent of politics. In other words, what science does is exactly what we deem acceptable. And so, as Stephen Jay Gould pointed out, just as you can't rewind the tape of evolution and be guaranteed an instant and identical replay, you can't assume that science would turn out just as it did. Science has always been contingent on economics and ethics, things which science can help inform, but ultimately can't decide for itself.

But it gets interesting. The two views are apparently not irreconcilable.

"I like his point about Hidden Markov models, which are practically the only 'perceptual' system of any use at all. Brains definitely work by trying to figure out what's in the world, and to do that they need probability estimates, which means analog values assigned to multiple possibilities at once (i.e. no clear "winner" at first)," says Softky.
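The kind of computation Softky is endorsing here can be sketched in a few lines. Below is a minimal, invented illustration of an HMM-style forward update - the states, transition matrix, and emission probabilities are made up for the example, not anything from Softky or Spivey - but it shows what "analog values assigned to multiple possibilities at once" looks like: a belief distribution over hypotheses, with no clear winner until the evidence forces one.

```python
def forward_step(belief, transition, emission, observation):
    """One Bayesian update: predict with the transition model,
    then weight each hypothesis by how well it explains the observation."""
    n = len(belief)
    # Predict: push the current belief through the transition matrix.
    predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    # Update: multiply by the likelihood of the observation in each state.
    unnorm = [predicted[j] * emission[j][observation] for j in range(n)]
    total = sum(unnorm)
    return [p / total for p in unnorm]

# Two hidden states ("cat", "dog"); two observations ("purr"=0, "bark"=1).
transition = [[0.9, 0.1], [0.1, 0.9]]   # states tend to persist
emission = [[0.8, 0.2], [0.1, 0.9]]     # cats mostly purr, dogs mostly bark
belief = [0.5, 0.5]                      # no clear "winner" at first

for obs in [0, 0, 1]:                    # purr, purr, bark
    belief = forward_step(belief, transition, emission, obs)
    print([round(p, 3) for p in belief])
```

Note that the probabilities are continuous ("analog") values, while the machinery computing them is entirely digital - which is precisely the distinction Softky goes on to draw.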

(Jeff Hawkins took some convincing on the value of probabilities, but is now a convert.)

Softky continues -

"But it's a long way from saying that the brain represents probabilities to somehow saying that's all it does, or saying the circuit elements do that."

"By analogy: spreadsheets, tax software, cell phones, and music players all clearly represent analog values too. Does that mean they're 'analog computers'?" he asks.

"Virtually every electronic device that deals with analog information uses digital elements to do so (except for old-fashioned radios and tape players)."

"The outside world is analog in many respects, so brains have to reflect that. But efficient and modular processing elements are digital, so brains ought to take advantage of that too."
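Softky's point is easy to demonstrate. The sketch below - invented for illustration; the 8-bit quantiser and the sine-wave "signal" are assumptions, not anything from the article - stores an analog waveform the way a digital music player would, as a finite set of levels, and checks that the worst-case error never exceeds half a quantisation step.

```python
import math

LEVELS = 256                     # an 8-bit quantiser
STEP = 2.0 / (LEVELS - 1)        # spacing of the levels across [-1, 1]

def quantise(x):
    """Map an analog value in [-1, 1] to the nearest of 256 digital levels."""
    code = round((x + 1.0) / STEP)        # integer code, 0..255
    return code * STEP - 1.0              # the analog value that code stands for

# An "analog" signal: one cycle of a sine wave, sampled 100 times.
signal = [math.sin(2 * math.pi * t / 100) for t in range(100)]
digital = [quantise(x) for x in signal]

# Rounding to the nearest level bounds the error at half a step.
worst = max(abs(a - d) for a, d in zip(signal, digital))
print(worst <= STEP / 2)
```

The digital copy carries the analog information to within a known, bounded error - which is why representing analog values doesn't make a device an analog computer.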

Then again, perhaps it's time the computer metaphor was dropped altogether from what may still exist as the "hard sciences", and certainly from the social sciences. Most of today's premiers obligingly fire up the computer metaphor when they're stuck for ideas.

Spivey's description of a computer fails to apply to analog hardware, but it doesn't really apply to vector or superscalar processors either. And are we describing the 'chip', or the 'system'? If we can't even agree on what a computer looks like, we can't really begin to guess which kind of computer we most resemble. (And at this point you wonder if grown-ups haven't got better things to do than debate such things. What kind of carpet are we? Or car key? Who cares?)

So comparing the brain to a computer isn't fair to humans, and it isn't even fair to computers. You have to allow science to be science, and the metaphors to live their own life. Neither party can make what David Letterman might call an "assgrab", in order to claim the authority of the other - and recent history is full of such presumptions - without anticipating some pushback. Both camps have, of late, settled into a state of snarling mistrust.

What Spivey's exercise seems to prove is that you can't attempt to "unite" the two camps unless one's own work is very firmly on terra firma to begin with. And when we see Spivey's students mousing around with a mouse, we can see it isn't. ®

Bootnote: But if you want to hear more on the wonders and limitations of analog - just mail us, and we'll do our best to oblige. ®

