Intel's lab crew makes case for 80-core world
CSI: It's optical
Exclusive: I have seen the future. It's full of Agilent testing equipment, clunky Nvidia drivers and enthusiastic, well-educated men.
The future didn't always look this way - at least not at Intel, where "GHz=God" wallpaper used to cover cubicle walls. The chip maker once indoctrinated workers with the religion of speed and did everything possible to convince consumers that a 2.0GHz chip made life so much more bearable than a 1.8GHz part. Intel relied on GHz tweaks to feel good about itself and thought about speed, speed, speed all the time.
Under these conditions, Intel's future appeared white hot. Well, actually, it was more rocket-nozzle, surface-of-the-sun hot. You all remember the slide Intel's then-VP Pat Gelsinger would toss out to show just how hot Intel could make a chip in two, five and ten years' time.
But then the industry shifted to multi-core chips where the so-called "platform" matters more than GHz. Getting consumers and software makers to embrace the "platform" mentality takes some serious work. At Intel, it's the labs teams that have accepted the brain-bending challenge.
Where the Intel of today has four-core processors, the Intel of tomorrow will have 80-core, 100-core and 120-core chips.
Explaining the need for so many cores proves easy enough for the server set. Customers at research labs and giant companies always want more horsepower.
These clients are already being fed a decent helping of multi-threaded software that can fly across the complex, multi-core processors. Myriad benchmarks exist that show the software performance improvements derived from multi-core chips. Real world customer code often tells the same story. So, while some still long for relentless single thread boosts, the corporate majority has accepted the multi-core future.
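The speedup those customers see comes from the basic multi-core pattern: split a CPU-bound job into chunks and run each chunk on its own core. A minimal sketch of that pattern (illustrative only, not Intel's or any customer's actual code; the prime-counting workload and function names are assumptions chosen for the example):

```python
# Sketch of the multi-core pattern: partition a CPU-bound task across
# cores with a process pool (processes sidestep CPython's GIL, so each
# chunk genuinely runs on its own core).
from concurrent.futures import ProcessPoolExecutor
import os

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def parallel_prime_count(limit, workers=None):
    """Split [0, limit) into one chunk per worker and sum the results."""
    workers = workers or os.cpu_count() or 1
    step = limit // workers + 1
    chunks = [(i, min(i + step, limit)) for i in range(0, limit, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(parallel_prime_count(10_000))  # 1229 primes below 10,000
```

Code written this way scales with core count only as far as the work divides cleanly, which is exactly why feeding an 80-core part is a software problem as much as a silicon one.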
The case for so-called terascale processors on the PC seems tougher to make.