Intel puts cloud on single megachip

One die, 48 cores

Intel's research team has unveiled a 48-core processor that it claims will usher in a new era of "immersive, social, and perceptive" computing by putting datacenter-style integration on a single chip.

And, no, it's not the long-awaited CPU-GPU mashup, Larrabee. This processor, formerly code-named Rock Creek and now known by the more au courant moniker of Single-chip Cloud Computer (SCC), is a research item only.

As Intel CTO Justin Rattner emphasized during his presentation (PDF) on Wednesday to reporters in San Francisco, "This is not a product. It never will be a product." But the SCC does provide an insight into the direction in which Intel is heading - and the path the company is treading is many-cored.

Rattner characterized the many-core future as "more perceptive," saying that "The machines we build will be capable of understanding the world around them much as we do as humans. They will see, and they will hear, they will probably speak, and do a number of other things that resemble human-like capabilities. And they will demand, as a result, very substantial computing capability."

Intel Single-chip Cloud Computer die

Not just 48 cores - 48 Intel Architecture cores

But the ancestor of those future chips, the SCC, is up and running today - as Rattner proudly pointed out while displaying a multi-die manufacturing wafer. "We're beyond the wafer level. [We have] packaged and running parts. This is not the typical Intel 'flash the wafer and then wait six months'."

The SCC is the second-generation experimental processor in Intel's Tera-scale Computing Research Program, the first being the 80-core Polaris, which it demoed in 2007.

While a move from 80 to 48 cores may seem like a step backwards, the SCC has one massive advantage over Polaris: its cores are fully IA-compliant. Polaris was a specialized beast, purely a proof-of-concept part. The SCC, by contrast, can do actual work - which Rattner and his crew proudly demoed.

One of the demos pointed directly towards the SCC's practical focus: Hadoop's Mahout machine-learning tools running an object-categorization task on the SCC with only minimal tweaking. As Mike Ryan, a software engineer from Intel Research Pittsburgh, explained to The Reg, "I didn't have to change any software. The only thing I had to do was permute some of the memory-configuration options as well as the distributed file-system options."

Other demos included the SCC running the compute-intensive Black-Scholes financial modeling app, a JavaScript-based 3D modeling app, and Microsoft Visual Studio compiling code for the chip's parallel-processing environment.
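
The Black-Scholes demo is easy to picture: each core grinds through the same closed-form option-pricing formula on its own slice of the input. Here's a minimal single-core C sketch of the European call-price calculation - an illustration of the per-core kernel, not Intel's demo code:

    /* Minimal Black-Scholes European call pricer - an illustrative
     * single-core kernel, not Intel's demo code.
     * Build with: cc -std=c99 bs.c -lm */
    #include <math.h>
    #include <stdio.h>

    /* Standard normal CDF via the C99 error function. */
    static double norm_cdf(double x) {
        return 0.5 * (1.0 + erf(x / sqrt(2.0)));
    }

    /* Call price for spot S, strike K, rate r, volatility sigma, expiry T. */
    static double bs_call(double S, double K, double r, double sigma, double T) {
        double d1 = (log(S / K) + (r + 0.5 * sigma * sigma) * T)
                    / (sigma * sqrt(T));
        double d2 = d1 - sigma * sqrt(T);
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2);
    }

    int main(void) {
        /* Example: $100 spot, $100 strike, 5% rate, 20% vol, one year. */
        printf("call = %.4f\n", bs_call(100.0, 100.0, 0.05, 0.2, 1.0));
        return 0;  /* prints call = 10.4506 */
    }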

In other words, the SCC ran off-the-shelf, real-world software thanks to its IA compliance, and functioned in the Hadoop demo as a datacenter-on-a-chip. "The move to Intel Architecture–compatible cores gives us an opportunity to make more ambitious efforts on the programming side," Rattner said.

At 567mm² and 1.3 billion transistors, the SCC is a hefty chip, but both frequency and voltage can be tweaked in real time as its performance scales, and Rattner claims the SCC dissipates anywhere from 25W to 125W.
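
That wide envelope follows from the textbook dynamic-power relation, P ≈ C·V²·f, which is why trimming voltage and frequency together pays off so handsomely. Here's a back-of-the-envelope C sketch - the 125W peak comes from Rattner's figure, but the 1.1V baseline is our assumption, not a published SCC number:

    /* Back-of-the-envelope dynamic-power scaling: P ~ C * V^2 * f.
     * Illustrative only - the baseline voltage below is an assumption,
     * not a published SCC figure. */
    #include <stdio.h>

    static double scale_power(double p_base, double v_base, double f_base,
                              double v, double f) {
        return p_base * (v / v_base) * (v / v_base) * (f / f_base);
    }

    int main(void) {
        /* Hypothetical 125W peak at 1.1V and full clock; halving both
         * voltage and frequency cuts dynamic power roughly eightfold. */
        printf("scaled power ~ %.1f W\n",
               scale_power(125.0, 1.1, 1.0, 0.55, 0.5));  /* ~15.6W */
        return 0;
    }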

The SCC's 48 IA-32 cores were described by Rattner as "Pentium-class cores that are simple, in-order designs and not sophisticated out-of-order processors you see in the production-processor families - more on the order of an Atom-like core design as opposed to a Nehalem-class design."

Tech specs for the 45nm CMOS high-k metal-gate part include four DDR3 channels and a 6-by-4 2D mesh network linking 24 dual-core tiles. The cores communicate by means of a software-configurable message-passing scheme using 384KB of on-die shared memory.
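
To make that concrete, here's a toy single-producer, single-consumer mailbox in C that sketches the idea: cores exchange messages through shared on-die buffers rather than relying on hardware cache coherence. The struct layout, message size, and busy-wait flag are invented for illustration - this is not the SCC's actual programming interface:

    /* Toy shared-memory mailbox sketching the SCC's message-passing
     * style. Invented for illustration - not Intel's actual API, and a
     * real implementation would need proper memory barriers. */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    #define MSG_BYTES 32  /* assumed payload size */

    struct mailbox {
        volatile uint32_t full;      /* 0 = slot empty, 1 = message waiting */
        uint8_t payload[MSG_BYTES];  /* imagined to live in shared SRAM */
    };

    /* Sender: wait for the slot to drain, copy in, then publish. */
    static void mbox_send(struct mailbox *mb, const void *msg, size_t len) {
        while (mb->full) ;  /* busy-wait; a real core would yield or sleep */
        memcpy(mb->payload, msg, len < MSG_BYTES ? len : MSG_BYTES);
        mb->full = 1;
    }

    /* Receiver: wait for a message, copy it out, then free the slot. */
    static void mbox_recv(struct mailbox *mb, void *out, size_t len) {
        while (!mb->full) ;
        memcpy(out, mb->payload, len < MSG_BYTES ? len : MSG_BYTES);
        mb->full = 0;
    }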

The SCC was designed by a 40-person research team of collaborating software and hardware engineers with members in Braunschweig, Germany; Bangalore, India; and Hillsboro, Oregon. As Rattner joked, "Not only did we manage to do somewhat over a billion transistors, but we did it on three continents in time zones that are roughly 10 to 12 hours apart - in one sense, somebody was working on it 24 hours a day."

Perhaps some day in the many-core future, those 40 engineers will be supplemented by seeing, hearing, and speaking computing assistants with "human-like capabilities." ®
