Original URL: http://www.theregister.co.uk/2007/03/20/intel_labs_mario/

Intel's lab crew makes case for 80-core world

CSI: It's optical

By Ashlee Vance

Posted in Hardware, 20th March 2007 23:33 GMT

Exclusive I have seen the future. It's full of Agilent testing equipment, clunky Nvidia drivers and enthusiastic, well-educated men.

The future didn't always look this way - at least not at Intel, where "GHz=God" wallpaper used to cover cubicle walls. The chip maker once indoctrinated workers with the religion of speed and did everything possible to convince consumers that a 2.0GHz chip made life so much more bearable than a 1.8GHz part. Intel relied on GHz tweaks to feel good about itself and thought about speed, speed, speed all the time.

Under these conditions, Intel's future appeared white hot. Well, actually, it was more rocket-nozzle/surface-of-the-sun hot. You all remember the slide Intel's then-VP Pat Gelsinger would toss out to show just how hot Intel could make a chip in two, five and ten years' time.

But then the industry shifted to multi-core chips where the so-called "platform" matters more than GHz. Getting consumers and software makers to embrace the "platform" mentality takes some serious work. At Intel, it's the labs teams that have accepted the brain-bending challenge.

Terable PCs

Where the Intel of today has four-core processors, the Intel of tomorrow will have 80-core, 100-core and 120-core chips.

Explaining the need for so many cores proves easy enough for the server set. Customers at research labs and giant companies always want more horsepower.

These clients are already being fed a decent helping of multi-threaded software that can fly across the complex, multi-core processors. Myriad benchmarks exist that show the software performance improvements derived from multi-core chips. Real-world customer code often tells the same story. So, while some still long for relentless single-thread boosts, the corporate majority has accepted the multi-core future.

The case for so-called terascale processors on the PC seems tougher to make.

True enough, video game developers, companies such as Adobe with Photoshop and operating system makers can use multi-core chips for serious speedups. But why would the average consumer want to throw 80 cores at Word and Microsoft's excuse for a browser, Internet Explorer?

Intel's labs crew have developed a number of applications meant to demonstrate what a consumer could get out of an 80-core dynamo.

A well-cored computer could, for example, make those tedious home movies more tolerable both for the family producing the films and people subjected to screenings of "The Day Johnny Ate a Lollipop by Himself."

Intel and partners, for example, have created software that eliminates jitter from video recordings.

The wife's grand ski run has never looked better with her top form coming through clearly despite your shaky hands. Soon, a company such as YouTube could offer the jitter removal as an option, and the home PC could do the dirty work crunching code to improve the clip.
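Intel didn't share the code behind the demo, but the general recipe for this sort of de-jittering is well known: track how the camera lurches from one frame to the next and warp each frame to cancel the motion. Here's a rough sketch of that idea using the open-source OpenCV library. The file name is a placeholder, the stabilisation simply locks every frame to the first one rather than smoothing the motion path as a real product would, and failure cases (no trackable features, a failed motion estimate) are glossed over.

```python
# Crude take on the jitter-removal idea (not Intel's software): track a few
# hundred corner features between consecutive frames, estimate how much the
# camera lurched, and warp each frame to cancel the accumulated shake.
import cv2
import numpy as np

cap = cv2.VideoCapture("ski_run.avi")          # placeholder file name
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
acc_dx, acc_dy = 0.0, 0.0                      # accumulated camera drift

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Find corners in the previous frame and see where they moved to.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=20)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = new_pts[status.flatten() == 1]

    # Estimate the frame-to-frame translation, then warp to undo it.
    m, _ = cv2.estimateAffinePartial2D(good_old, good_new)
    acc_dx += m[0, 2]
    acc_dy += m[1, 2]
    fix = np.float32([[1, 0, -acc_dx], [0, 1, -acc_dy]])
    h, w = frame.shape[:2]
    steadied = cv2.warpAffine(frame, fix, (w, h))

    cv2.imshow("steadied", steadied)
    if cv2.waitKey(30) & 0xFF == 27:           # Esc to quit
        break
    prev_gray = gray

cap.release()
cv2.destroyAllWindows()
```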

In the same arena, Intel has mastered an application that can scan a lengthy home video of, say, your son's football (soccer) game and pull out the highlights from when the youngling scored a goal or maimed that jerk kid from down the block. Intel's code searches the video for spikes in cheering or fierce on-field activity to locate the best bits of a game. It can also zero in on individual players and track them throughout the contest.
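Again, Intel didn't hand out source, but the "listen for the crowd going nuts" half of the trick can be sketched in a few lines. The snippet below is purely illustrative – the file name, the one-second windows and the 3x-median loudness threshold are all assumptions, not Intel's values – and it simply scans a match recording's audio track for the moments where the noise level jumps.

```python
# Minimal sketch of the "find the loud bits" idea behind the highlight-reel
# demo: scan a recording's audio for bursts of crowd noise and report the
# timestamps worth keeping.
import wave
import numpy as np

def loud_moments(wav_path, window_s=1.0, threshold=3.0):
    with wave.open(wav_path, "rb") as wav:
        rate = wav.getframerate()
        frames = wav.readframes(wav.getnframes())
    # Assume 16-bit mono PCM for simplicity.
    samples = np.frombuffer(frames, dtype=np.int16).astype(np.float64)

    window = int(rate * window_s)
    n_windows = len(samples) // window
    # RMS loudness per window.
    rms = np.sqrt(
        (samples[: n_windows * window].reshape(n_windows, window) ** 2).mean(axis=1)
    )
    baseline = np.median(rms)
    # Any window markedly louder than the typical murmur is a candidate highlight.
    return [i * window_s for i, level in enumerate(rms) if level > threshold * baseline]

if __name__ == "__main__":
    for t in loud_moments("match_audio.wav"):   # placeholder file name
        print(f"possible highlight around {t:.0f}s")
```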

"Grandma is not going to sit there for an hour watching the game, but she will sit there and watch a five-minute highlight reel," one Intel lab staffer told us, during our recent visit to the company's Santa Clara nerdery.

Intel also showed off some software that lets users manipulate objects on a screen with the aid of a PC camera. You can see one example of this software in action below – vulture bubbles are always an excellent choice.

The Games People Play

While Intel hopes to gain the attention of customers outside the gaming set, it can't help but show off what a terascale chip would do for a first-person shooter.

Intel has spent the last couple of years pumping the idea that ray tracing – a rendering technique that provides more sophisticated interactions between light and objects – will conquer raster graphics.

Despite pursuing its own graphics chip agenda, Intel contends that mainstream, multi-core x86 chips will bring video games to life in ways that the GPU crowd cannot match. The physics calculations demanded by techniques such as ray tracing make a “general purpose” x86 chip ideal for producing the graphics-rich games of tomorrow.
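The argument rests on ray tracing being embarrassingly parallel: every pixel's ray can be traced independently, so the work spreads naturally across however many cores you throw at it. The toy example below (nothing to do with Intel's actual renderer) traces a single hard-coded sphere and farms the image rows out to one worker per core.

```python
# Toy illustration of why ray tracing scales across cores: each pixel's ray
# is independent, so rows of the frame can be handed to separate workers.
import math
from multiprocessing import Pool, cpu_count

WIDTH, HEIGHT = 64, 32                         # tiny "frame buffer"
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0

def trace_row(y):
    row = []
    for x in range(WIDTH):
        # Shoot a ray from the origin through this pixel on a unit image plane.
        dx = (x / WIDTH) * 2 - 1
        dy = (y / HEIGHT) * 2 - 1
        dz = 1.0
        length = math.sqrt(dx * dx + dy * dy + dz * dz)
        d = (dx / length, dy / length, dz / length)

        # Ray-sphere intersection: hit if (d.c)^2 - (|c|^2 - r^2) >= 0.
        c = SPHERE_CENTER
        b = d[0] * c[0] + d[1] * c[1] + d[2] * c[2]
        k = c[0] ** 2 + c[1] ** 2 + c[2] ** 2 - SPHERE_RADIUS ** 2
        row.append("#" if b * b - k >= 0 else ".")
    return "".join(row)

if __name__ == "__main__":
    with Pool(cpu_count()) as pool:            # one worker per core
        for line in pool.map(trace_row, range(HEIGHT)):
            print(line)
```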

“GPUs are not set up for the general purpose types of workloads that we're talking about,” noted one of Intel's researchers. “Physics is one of the most general workloads, and those types of calculations will fly on (the multi-core) chips.”

Besides being better suited to certain workloads, general purpose chips from Intel boast large on-chip memory stores. The limited local memory of GPUs hampers the products from “rendering complex scenes” in games, according to Intel.

Damn you developers, developers, developers

Even if Intel can convince consumers, game developers and the like of the terascale chips' merits, the company faces a huge coding conundrum.

Relatively few coders know how to take advantage of multi-core chips, and most of the folks who do know the way of multi-core are spending their time pushing out server code. So, it may be the case that Intel dumps a fantastic, super-powerful chip on the market and then has to wait years for Microsoft and crew to write software that flies on the silicon. To help fix this situation, Intel has put research and development dollars in the hands of tens of universities. It hopes to encourage the schools to set up multi-threaded coding courses.
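The restructuring those courses would drill into students is, at its simplest, a matter of slicing one big loop into chunks and handing the chunks to separate workers. The sketch below is a made-up example of that rewrite – the "image brightening" workload is a stand-in – and the serial and parallel versions produce identical results; getting real desktop code into that shape is the hard part Intel is worried about.

```python
# Serial vs. multi-core version of the same loop: the parallel path splits
# the data into chunks and lets one worker per core chew through them.
from concurrent.futures import ProcessPoolExecutor
import os

PIXELS = list(range(1_000_000))    # stand-in for one channel of a large image

def brighten(chunk):
    return [min(255, p + 40) for p in chunk]

def serial():
    return brighten(PIXELS)

def parallel(workers=os.cpu_count()):
    size = len(PIXELS) // workers + 1
    chunks = [PIXELS[i:i + size] for i in range(0, len(PIXELS), size)]
    out = []
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for part in pool.map(brighten, chunks):
            out.extend(part)
    return out

if __name__ == "__main__":
    # Same answer either way; the work just lands on more cores.
    assert serial() == parallel()
```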

Beyond software, there are more problems – this time for the hardware crew. (And with that, I give you a forbidden journalistic technique known as “burying the lede.”)

CSI Drama

Keeping an 80-core chip happy proves tougher than you might think.

Each core wants access to on-chip memory and wants it fast. These hungry cores don't have time to go running across precious silicon real estate looking for data.

To address this problem, Intel's researchers have been hammering away on the concept of 3-D memory where the cache rests above the processor cores. This technique lets a core anywhere on a single slice of silicon tap the cache in just a couple of cycles rather than traversing the entire chip to reach the cache bank.

Intel's researchers have also pioneered technology in the field of silicon photonics.

Here Intel looks to replace the wires that connect data centers, server racks and even chip components with beams of light. (We profiled Intel's latest silicon photonics breakthrough in January.)

Intel has suggested that on-board silicon photonics technology remains years away, but the company has already produced systems dabbling with the technology.

We spied one board code-named Coalbrook (or CoalCreek: it was hard to tell as Intel's staff urged us away from the confidential systems) and another called Springville that had built-in optical modules. Both systems were identified as using Intel's upcoming CSI (common system interconnect) technology, which is the company's attempt to catch up with AMD's HyperTransport/integrated memory controller technology.

For more on Intel's silicon photonics work, we urge you to check out the following interview with Mario Paniccia, a research director at Intel. You'll quickly get a picture of just how bright Intel's top staff are.

Here's Mario

How terable are these things?

Pundits have long hit Intel and AMD with critiques for pushing out ever-faster processors when Joe Public hardly needs a world-class chip to run basic applications.

We, however, tend to think that consumers and companies will always find new ways to gobble up cycles – usually once Microsoft makes it possible to do so. Such a scenario certainly holds true in the server game where there's an insatiable desire for more horsepower.

Will Intel's cute bubble popping software drive the need for these new chips? Well, no. But that's not really the point, is it? ®