Computers will disturb heat balance of universe, says Sun's Gage
JavaOne Molecular computation could eventually threaten a cosmic meltdown, thinks Sun co-founder John Gage. And he wasn't being frivolous.
"In fifty years, computation will be so complex, and so demanding of memory and working on devices of such intricacy" - such as terabyte-on-a-sugarcube storage - "that a single calculation could change the heat level of the universe," he says, citing Sun's CTO Greg Papadopoulos.
One of the pleasures of JavaOne is that it's possible to yank Sun's illuminati away from the lectures for long enough to compare geek gadgets. So The Register cornered the marvelous Gage, who was incidentally a delegate for RFK at the siege of Chicago in 1968, for long enough to chat about Firewire RAID arrays. And a computer-induced apocalypse...
Actually, if it sounds like Gage is trying to have a rhetorical doomsday race with fellow Sun founder Bill Joy - Joy recently warned that mankind could face being hostage to machines - Gage is pretty optimistic that smarter algorithms might save us in time.
The problem is, molecular level computing is advancing faster than anyone thought. What was doable in 15 to 20 years, now looks doable in five to ten, says Gage. And the complexity problems apply to both processors and storage.
"The most pressing problem in a physics sense lies in reducing heat. The machines we have now in the 16-18 million transistor complexity will soon be in the 100-120 million transistor complexity," he says. "And moving beyond not just planar architectures where you have a two-way grid but into 3D structures. Every time you alter something, it dissipates some energy - so where does it go?"
Back to the sugarcube of Beelzebub.
"If altering the shape of a protein is how you store a bit, then a little bit of energy is dissipated. Do that in a billion places, and it melts," says Gage. "We have to be very careful - it's an ecological balance we're disturbing by making a computation."
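Gage doesn't put numbers on it, but the physics he's gesturing at is Landauer's principle: erasing a bit of information dissipates at least kT ln 2 joules of heat. A back-of-envelope sketch (our figures, not his):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k*T*ln(2) joules.
# (Our illustration of the physics Gage alludes to; he cites no figures.)
K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T_ROOM = 300.0              # kelvin, roughly room temperature

e_bit = K_BOLTZMANN * T_ROOM * math.log(2)  # ~2.9e-21 J per bit erased

# "Do that in a billion places" - the thermodynamic floor for 1e9 bit flips:
e_billion = e_bit * 1e9  # ~2.9e-12 J at the theoretical limit

print(f"per bit: {e_bit:.2e} J, per billion bits: {e_billion:.2e} J")
```

The floor itself is tiny; the catch is that real logic dissipates many orders of magnitude above it and repeats the operation billions of times a second, which is where the melting comes in.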
Which is where better brainwork might save us. "But usually what does happen is that there is some disjunct point where we have an equivalent increase in the rapidity of calculation by development of algorithms." He mentions the IEEE's recent choice of Top Ten Algorithms Of The Century - and how several of them were quite recently devised.
So what about optical computing, we wondered? When we were bairns, Tomorrow's World constantly advised us it would be the next big thing after silicon microprocessors.
Well, the breakthroughs are great news for Lucent, he thinks, but not so great for computer manufacturers, because the switching components aren't there. Although you can put a switch on anything - Babbage had mechanical switches as reliable as today's computers - it's not going to be easy developing them to the same degree of consistency as computer switches.
As for the gadgets: Gage enthused about the e-bike, brainchild of Lee Iacocca, the former Ford and Chrysler boss. A Java-equipped e-bike was on the show floor.
Gage also thinks that if Sony can play its cards right, the Playstation could find itself displacing the PC not only in the home but in business.
"Imagine a 30Gb hard disk in the Playstation2, with a DVD, with FireWire, with USB, with a processor and twin vector engines that are quite fast. Forget the set-top box - this is the device that allows the entire world to communicate," he says, "all for $250."
"Hello, 30Gb disk!" he then said, rather cruelly we thought, to our Psion Series 5. "I bet you're not 30Gb." Er, no, but size isn't everything, John.
Apple G4 owner Gage had picked up one of those dinky FireWire RAID stacks at the San Francisco MacWorld show, and as we'd just been talking about Gnutella and Napster, he could see massively distributed computing coming home, eventually. We'd all be one big shared library.
But before it arrived, we'd have run into the problem of the disappearance of the canonical version of a text. Transmission errors and bit-flipping would mean we'd never know what was the genuine article: "Who's got a copy of this particular text, or image, and who's got the original? The whole history of human endeavor is deteriorating and will need to be checked." Sub-editors, apply here... ®
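For what it's worth, the checking Gage describes is exactly what content hashes are for - a sketch of the standard approach (ours, not anything he proposed):

```python
import hashlib

# The worry: bit-flips in transit leave no way to spot the canonical copy.
# A published hash of the original makes any corruption detectable.
original = b"The whole history of human endeavor"
corrupted = bytearray(original)
corrupted[0] ^= 0x01  # flip a single bit in transit

digest_a = hashlib.sha256(original).hexdigest()
digest_b = hashlib.sha256(bytes(corrupted)).hexdigest()

# Even a one-bit difference yields a completely different digest, so any
# copy can be verified against the hash of the canonical text.
print(digest_a == digest_b)  # False: the flip is detectable
```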