Deep inside AMD's master plan to topple Intel
Back to the top on a radical GPU
AMD's new graphics architecture isn't merely about painting prettier pictures. It's about changing the way computers compute.
As first revealed last month at AMD's Fusion Developer Summit, the chip designer has gone out of its way to ensure that its future APUs – accelerated processing units, which is what the company calls its CPU/GPU mashups – don't merely relegate the CPU and GPU to being neighbors sharing the same slice of silicon. It seeks to make the CPU and the GPU full partners in whatever a computer's operating system and apps can throw at them.
The idea of this workload sharing first gained public awareness in 2004, when a group of seven computer-science researchers from California's Stanford University presented a paper at that year's SIGGRAPH conference on a programming environment called Brook for GPUs. That research was the seed of what has become known as GPU compute, general-purpose computing on GPUs, or simply GPGPU.
That effort to leverage the massive parallelism of GPUs to offload appropriate compute tasks from CPUs really began to take off in 2007, according to AMD graphics CTO Eric Demers, accelerating in 2009 as DirectCompute and OpenCL began to win adherents.
Still, the ATI – now AMD – GPUs of the time were based on a relatively straightforward VLIW (very long instruction word) architecture that was designed and tuned for graphics. GPGPU usage was secondary. "This architecture – when we start using it for generalized compute – works okay," Demers said at the Fusion Developer Summit, "but it leverages the graphics."
In the last 12 to 18 months, however, the GPU's ability to work closely with the CPU has become more important in AMD's designs. Graphics performance is still the central goal, but there are other goals as well. "We are making significant optimizations for compute," Demers says. "We are looking at things differently."
Demers offers AMD's January 2011 HK-2207 demo – which, incidentally, was ridiculed by an Nvidia exec – as an example of this new thinking. "This demo uses GPU to do all the particle physics," he says. "It uses deferred lighting with G-Buffers, it does post-processing as a compute operation – it does very complex operations. And what is compute and what is graphics is blurred."
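To give a flavour of what that GPU-side compute work looks like, here is a minimal, hypothetical OpenCL kernel – a sketch for illustration only, not code from the HK-2207 demo – that advances a set of particles by one timestep, one work-item per particle:

    // Hypothetical illustration: advance each particle by one timestep.
    // One work-item handles one particle; the GPU runs thousands in parallel.
    __kernel void step_particles(__global float4 *pos,   // xyz position, w unused
                                 __global float4 *vel,   // xyz velocity, w unused
                                 const float4 gravity,   // constant acceleration
                                 const float dt,         // timestep in seconds
                                 const uint count)       // number of particles
    {
        uint i = get_global_id(0);          // this work-item's particle index
        if (i >= count) return;             // guard against padded launch sizes

        float4 v = vel[i] + gravity * dt;   // accelerate
        pos[i] = pos[i] + v * dt;           // move
        vel[i] = v;                         // store updated velocity
    }

Each work-item is independent of the others, which is exactly the sort of massively parallel, arithmetic-heavy job a GPU's wide array of ALUs is built for.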
Graphics meets compute. Compute meets graphics
AMD has revealed a roadmap of its effort to continue blurring the line between graphics and compute, dubbed the Fusion System Architecture (FSA). This ambitious wish list seeks to combine the CPU and GPU into a single computing team by simplifying the programming model, unifying the memory accessed by the CPU and GPU, lowering the latency of dispatching tasks between them, and making other enhancements.
The first column of AMD's FSA roadmap is essentially complete. Work on the others is underway (click to enlarge)
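To see why unified memory matters, consider a minimal, hypothetical OpenCL host program under today's discrete-memory model – the buffer names and the trivial "scale" kernel are illustrative, not taken from any AMD code. The CPU must stage its data into a separate GPU buffer, dispatch the kernel, and copy the results back before it can touch them; these are the round trips that FSA's unified memory is meant to reduce:

    /* Hypothetical sketch of today's discrete-memory model: the host explicitly
     * copies data to the GPU, launches the kernel, and copies results back.
     * FSA's goal of a unified CPU/GPU memory aims to cut out this staging. */
    #include <CL/cl.h>
    #include <stdio.h>

    static const char *src =
        "__kernel void scale(__global float *buf, const float k) {"
        "    size_t i = get_global_id(0);"
        "    buf[i] = buf[i] * k;"
        "}";

    int main(void)
    {
        enum { N = 1024 };
        float data[N];
        for (int i = 0; i < N; i++) data[i] = (float)i;

        cl_platform_id plat;  cl_device_id dev;  cl_int err;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

        /* Step 1: allocate a separate buffer in GPU memory and copy the data in. */
        cl_mem gpu_buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, sizeof(data), NULL, &err);
        clEnqueueWriteBuffer(q, gpu_buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

        /* Step 2: build the kernel and dispatch it across N work-items. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", &err);
        float factor = 2.0f;
        clSetKernelArg(k, 0, sizeof(cl_mem), &gpu_buf);
        clSetKernelArg(k, 1, sizeof(float), &factor);
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

        /* Step 3: copy results back to host memory before the CPU can read them. */
        clEnqueueReadBuffer(q, gpu_buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
        printf("data[10] = %f\n", data[10]);   /* expect 20.0 */
        return 0;
    }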
Accomplishing all the goals of the FSA, Demers says, requires rethinking the GPU core. And so over the past three years AMD engineers have been doing just that, with the result being the company's next-generation graphics architecture, dubbed Graphics Core Next (which, of course, has its own TLA: GCN). GCN is a fundamental, piece-by-piece rethinking of the former ATI/AMD architectures.
COMMENTS
That's always the odd thing.
AMD moans about Intel, but Intel does one thing that AMD never does... advertise.
So when folks go to buy a PC, they get a choice of Intel or AMD.
Well, they've never heard of AMD, but they get to hear the Intel jingle at least three times a day on TV, so they buy from the one brand they have heard of.
Simple.
AMD needs to sack its marketing team, get a new one, and start spending on some jingles, etc. After all, there is only one other company doing it in their field, so how hard can it be?
Even Acer has adverts in the UK.
As for no-one wanting AMD, I build my budget PC boxes with AMD CPUs in them. Why? The saving from using AMD enables me to put a 60GB SSD in the box. Customers don't notice the difference between an Athlon II and an i5, but they notice the SSD. They also get USB 3.0/HDMI/usable integrated graphics, which they don't get on the budget Intel motherboards.
Well, duh!
You, sir, are of the highest intelligence because you happen to have, with the unusual sharpness of a Moriarty-like mind, recognized, in a flash as it were, what capitalism is all about.
Except for the "all but irrelevant", "things that nobody wants" and "simply don't perform" parts.
Have a cookie. It's a bit mouldy.
Agreed.
They revolutionised CPUs. I remember how great my first Athlon was compared to the Intel chips of the time, and cheap too!
We should all thank AMD...
for without AMD, Intel would still be selling the 500MHz Pentium 5 at $1,000 per unit.
AMD is far from irrelevant. Now VIA is irrelevant.
AMD is taking RADEON and changing the core of computing.
Intel has Sandy Bridge, which is a Core Duo glued to crummy graphics.
Intel has no graphics capability. RADEON was an expensive purchase, but just maybe AMD knew what it was doing when it overpaid for it.

