Moore's Law has ten years to run, predicts physicist
Silicon reaching the end of the road
Renowned theoretical physicist Michio Kaku has predicted Moore's Law will run out of steam within the next ten years as silicon designs run up against the laws of physics.
"In about ten years or so we will see the collapse of Moore's Law," he says. "In fact we already see a slowing down of Moore's Law. Computing power cannot maintain its rapid exponential rise using standard silicon technology."
Gordon Moore's famous 1965 prediction said that the number of transistors that could be packed on a silicon chip would double every year, although he later amended this to every two years. The prediction has stood up for far longer than Moore suggested (he originally envisaged a ten-year run), albeit with some tweaking from Intel so that performance will double, not the number of transistors.
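The arithmetic behind the two-year formulation is a simple geometric series. A minimal sketch, using the roughly 2,300 transistors of Intel's 4004 as an illustrative starting point (the function name and figures are assumptions for the example, not from the article):

```python
def projected_transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count assuming one doubling per period."""
    doublings = (year - start_year) // doubling_period
    return start_count * 2 ** doublings

# Ten doublings over twenty years turns ~2,300 into ~2.4 million.
print(projected_transistors(2300, 1971, 1991))  # 2355200
```

The same exponential logic is why a hard physical floor (such as the five-nanometer limit Kaku describes) ends the trend abruptly rather than gradually: each missed doubling halves where the curve says you should be.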
However, as silicon transistors get down to five nanometers and below, they will become useless due to overheating and electron leakage, Kaku predicts. Intel is boosting performance in other ways, such as the use of multi-core processors and the tri-gate transistors currently found in the latest Ivy Bridge range, but these have a limit on silicon, Kaku warned.
Quantum computing is one avenue that offers a way forward, but Kaku dismissed the technology as hopelessly immature, saying the first proper quantum systems won't come online until late in the 21st century. The more likely candidate is molecular computing, he predicts, with optical chips also providing some support.
Michio Kaku is a very intelligent and well-respected futurologist, and his ideas have a lot of merit, although researchers in quantum computing might quibble with his timescale predictions. The decline of Moore's Law does, however, look inevitable, and it will be missed, not least by Intel, which has built a marketing message around the concept.
If Moore's Law does fail it could also see the end of another time-honored tradition – the annual Intel Developer Forum sweepstake where attendees bet amongst themselves as to how long the opening keynote speaker will go before mentioning Gordon Moore's prediction. ®
And not a moment too soon!
The horrific bloated slop that passes for code these days is an embarrassment to anyone of pre-GUI age.
Maybe when/if a processor cap appears, Moore's Law could be continued (in a fashion) by people dumping some of these lazy libraries and putting a bit more thought into their code, so that the processor is actually doing something useful and not merely navigating its way around excessive layers of pointless abstraction!
Oooh, time travel!
I think I've read this article already, around ten years ago!
Often stated and pretty consistently wrong
Since both Intel and AMD have been leaking their experiments with real multi-layer 3D chip designs, I'd say that in 10 years we will have hammered out another generation of fairly conventional silicon production by building up instead of shrinking. Considering the big problem with this has actually been heat (as in their test Pentium 3 + RAM dies melting in the lab), and there have been some interesting developments in that arena in the last few months, I am betting that we can look forward to quite a few more "Moore's Law" keynotes. Then will probably come optical interconnects, persistent-state RAM and a host of other new tech.
Pity too, because when all of that runs out of headroom, things may actually get interesting. Until then, unless fabs stop costing tens of billions of dollars, things will probably stay incremental and safe and dull. Though I do wonder if AMD will be around to see it (as anything other than a GPU house, at least).