Intel introduces Sandy Bridge chippery
Swiss Army processor
IDF Intel took the wraps off its new Sandy Bridge microarchitecture Monday morning — now officially branded as the 2nd Generation Intel Core Processor — revealing a number of notable improvements over its current Nehalem-based processor line, including what the company claims are greatly improved on-chip integrated graphics.
"Sandy Bridge will revolutionize PCs — again," said Intel president and CEO Paul Otellini during his keynote presentation at the Intel Developer Forum in San Francisco. "On one single chip, we've put in place all the critical capabilities for computing."
Like the Westmere parts previewed last December and introduced at the Consumer Electronics Show in January, Sandy Bridge chips will be built using Intel's 32-nanometer process. In Intel parlance, Sandy Bridge is a "tock" — a new architecture on an existing process — while Westmere was a "tick" — an existing (Nehalem) architecture on a new (32nm) process.
While Westmere had a 45nm GPU inside the same package as its 32nm CPU, Sandy Bridge brings the GPU circuitry onto the same 32nm die, along with the memory controller, a dedicated video transcoder, and the CPU's improved media-handling 256-bit-wide SIMD unit called AVX — advanced vector extensions. As might be guessed, a 256-bit AVX SIMD unit should be able to process media-centric data at up to twice the rate of Intel's previous 128-bit SSE SIMD.
Note that Intel isn't the only chip baker to be moving up to AVX — support for at least a hefty chunk of the AVX instruction set is also planned for inclusion in AMD's upcoming burly Bulldozer, set for release next year.
AMD's long-delayed Fusion line of processors will also share another core (no pun intended) Sandy Bridge feature: an on-die GPU. Sandy Bridge's GPU is a three-part system, the first part being a collection of multiple execution units, with the number of units to vary by processor level — six and twelve had been reported in pre-IDF leaks.
Second is the media-processing unit, which will handle both video decoding and encoding. Dadi Perlmutter, headman of Intel's architecture group, made much of this capability in his Sandy Bridge discussion, speaking of how users want video encoding to take place in seconds, not minutes, and how he claims that Sandy Bridge's video-transcoding capabilities will deliver that level of performance.
The third part of the GPU is 3D processing. Both the GPU and the CPU cores (Monday's demos showed a four-core part) communicate with a shared "last level cache" over a ring bus à la Intel's ill-fated Larrabee discrete-graphics CPU/GPU mash-up, which never saw the light of day as anything other than a development platform.
Sandy Bridge, to no one's surprise, will also support Intel's two-threads-per-core Hyper-Threading technology, as well as its Turbo Boost tech, which allows one or more cores to be boosted when other cores are idle, or even all cores together to get a bit of extra juice when needed — even when doing so would briefly exceed that processor's thermal envelope.
One nifty feature of Sandy Bridge's turbo capability is that the CPU and GPU can be "turboed" separately. Need extra GPU power for gaming? Turbo the GPU. Doing some sophisticated number-crunching? Turn down the GPU and juice the CPU. All of this turbo management happens in a split second, but further details will have to wait for a tech track less marketing-focused than a keynote.
Motherboard manufacturers will be pleased to know that Sandy Bridge, as we reported in April, will require yet another new socket, the LGA-1155 — which follows the LGA-1366 for the original Core i7 and the LGA-1156 for Lynnfield/Clarkdale. Do Otellini and Perlmutter own stock in MSI or Asus? (Just kidding, just kidding...)
Otellini and Perlmutter's morning performance was merely Sandy Bridge's unveiling. Throughout IDF's first day, the microprocessor track will be all-Sandy, all the time, with an assortment of sessions peeling back successive layers of the new architecture. Expect more Sandy Bridge details from your Reg reporter after he takes a deeper dive into those sessions to shake off the "gee-whiz" flash 'n' dazzle of the opening keynote. ®
But I really can't help thinking of the old Absolutely sketches about that Scottish town council every time I see "Sandy Bridge".
All this and?
The graphics will still be siht!
Paris, she always looks good in hi-res but not on this shady-bridge...
I keep hearing how bits of hardware will revolutionise the computer world, but it's largely untrue.
These are just minor iterations and speed increases. In effect it's just comparable to the day they added an FPU into a processor.
The addition of an FPU made certain operations faster, but unless you were a ray tracer you probably didn't notice. Processors were so slow back then (25MHz) that a dedicated floating-point unit made sense. Processors are much faster now.
So will a GPU inside a CPU really make that much difference? Maybe it will be cheaper: a single fan for both the CPU and GPU. But what about multiple displays, and dual cards linked for performance?
I think the only revolutions are in computer form factor and software. The hardware used is largely secondary.
I'll be reserving judgement
Going on Intel's track record of graphics adapter performance (including the ones they made Vista Basic for), I'll be sticking to my AMD/nVidia for the foreseeable future.
So what happens when I want to do some OpenCL shit?
Do I Turbo Boost the CPU, or do I Turbo Boost the GPU? Or both? Or neither?