Cool Fusion: AMD's plan to revolutionise multi-core computing

Different cores for different chores

AMD naturally approached potential partners with graphics chip development expertise - long before ATI and AMD first began discussing the possibility of a merger, in late December 2005/early January 2006, according to Hester - and while the ultimate choice was ATI, we'd be very surprised if AMD didn't talk to a number of ATI's competitors. In the glow of the post-merger honeymoon period, erstwhile ATI staffers paint a rosy picture of the two firms' highly aligned goals.

Certainly, ATI had developed an interest in processor technology, having seen how researchers were increasingly turning to programmable GPUs to process non-graphical data. As Bob Drebin, formerly of ATI's desktop PC products group but now AMD's graphics products chief technology officer, puts it, science and engineering researchers and users had become attracted to the modern GPU's hugely parallel architecture and gigaflops-class performance.

[Image: AMD Fusion - CPU vs GPU performance acceleration]

A GPU typically takes a heap of pixel data and runs a series of shader programs on each pixel to arrive at a final set of colour values. As display sizes have grown - their demands magnified by anti-aliasing requirements - GPUs have evolved to process more and more pixels this way simultaneously.
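To make that concrete, here's a minimal CUDA sketch of the model - one thread per pixel, all running the same tiny 'shader'. The kernel, its name and the darkening factor are our own invention for illustration, not anything AMD or ATI has published:

    #include <cstdint>

    // A hypothetical per-pixel "shader": darken every RGBA pixel by a
    // constant factor. One GPU thread handles one pixel, so thousands of
    // pixels are shaded at once - the same model a real pixel shader uses.
    __global__ void shade_pixels(uint32_t *pixels, int count, float factor)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        uint32_t p = pixels[i];
        uint32_t r = (uint32_t)(((p >> 16) & 0xFF) * factor);
        uint32_t g = (uint32_t)(((p >>  8) & 0xFF) * factor);
        uint32_t b = (uint32_t)(( p        & 0xFF) * factor);
        pixels[i] = (p & 0xFF000000u) | (r << 16) | (g << 8) | b;
    }

    // Launch one thread per pixel, e.g. for 'count' pixels already on the GPU:
    // shade_pixels<<<(count + 255) / 256, 256>>>(dev_pixels, count, 0.8f);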

But since the result of any pixel shader run is just a set of binary digits, and digital data can be interpreted in whatever way the programmer believes is meaningful, modern GPUs are not limited to pixels. Try fluid dynamics data instead: replace pixels with particle velocities, and use shader code to calculate the effect on a given particle of all the nearby particles. The result this time is a number you interpret as a new velocity value rather than a colour.
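Here's a hedged sketch of that swap, along the same lines as the kernel above: the buffer now holds particle positions and velocities, and the per-thread program does a crude all-pairs interaction in place of a real fluid solver. The force law and softening constant are ours, purely for illustration:

    // The same kernel shape, but the "pixels" are now particle velocities.
    // Each thread sums the pull of every other particle on its own - a
    // crude all-pairs interaction standing in for a real fluid solver.
    __global__ void update_velocities(const float3 *pos, float3 *vel,
                                      int n, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        float3 accel = make_float3(0.0f, 0.0f, 0.0f);
        for (int j = 0; j < n; ++j) {
            if (j == i) continue;
            float dx = pos[j].x - pos[i].x;
            float dy = pos[j].y - pos[i].y;
            float dz = pos[j].z - pos[i].z;
            float d2 = dx*dx + dy*dy + dz*dz + 1e-6f;  // softened: no divide-by-zero
            float inv = rsqrtf(d2) / d2;               // roughly 1/d^3
            accel.x += dx * inv;
            accel.y += dy * inv;
            accel.z += dz * inv;
        }
        // The output is just bits: here we read them as a new velocity,
        // not a colour.
        vel[i].x += accel.x * dt;
        vel[i].y += accel.y * dt;
        vel[i].z += accel.z * dt;
    }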

That's real physics, but as both ATI and Nvidia have been keen to point out during the past six months or so, it applies equally to the movement of objects and substances in a game.

Crucially, though, this doesn't mean the CPU isn't needed any more, Drebin says. Going forward, it's a matter of matching a given task to the processing resource - GPU or CPU - that will be able to crunch the numbers most quickly. But if you're looking at apps with a much higher demand for GPUs than CPUs, ATI's thinking went, maybe we should be designing products that also provide the more general-purpose processing that CPUs do so well.
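What that matching might look like is easy to sketch, though the heuristic here is ours rather than AMD's: send big, uniform, data-parallel jobs to the GPU and keep everything else on the CPU. The break-even threshold below is an assumption; a real scheduler would measure both paths:

    // Invented break-even point: below it, launch overhead swamps any
    // GPU advantage. A real scheduler would profile both paths.
    enum Unit { UNIT_CPU, UNIT_GPU };

    Unit pick_unit(size_t elements, bool data_parallel)
    {
        const size_t gpu_worthwhile = 1 << 16;  // assumed, not measured
        if (data_parallel && elements >= gpu_worthwhile)
            return UNIT_GPU;  // wide, uniform work suits shader-style cores
        return UNIT_CPU;      // serial or small jobs stay on the CPU
    }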

It's not hard to imagine, incidentally, Nvidia thinking along similar lines, particularly given the rumours that it's exploring x86 development on its own. It's hard to believe it has some scheme to break into the mainstream CPU market - as AMD's Hester says, there are really only two companies that make x86 processors, but "four or five that have tried and failed" - but it could well be planning general-purpose processing units tied to powerful GPUs and aimed at apps that need that balance of functionality.

The beauty of AMD's modular approach is that it allows the company to produce not only more CPU-oriented processors but also the GPU-centric devices ATI had been thinking of. And all of them are aligned architecturally and guaranteed baseline compatibility thanks to their adherence to the x86 instruction set.

[Image: AMD Fusion - the concept]

Which will itself have to adapt, Hester says, to take on new extensions that will allow coders to access the features of the GPU directly - or, indeed, of other modules that are brought on board as and when it makes economic sense to do so. The big users will be the OpenGL and DirectX API development teams, but other software developers are going to want to access these future x86 extensions for other, non-graphical applications.
