AMD, ATI and the GPU

Breaking the monopoly

Intel has taken the "black hole" concept to extremes. You have a central processor which takes time out from running your software to do the weirdest things. It doesn't just control communications: if you use the modem socket on today's PC, the audio tones sent from your computer are generated, blip by blip, by the central processor, and the returning tones are analysed, wave by wave, by the same processor. It is so powerful that it does all this in between the cycles it spends running Windows.
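
To see what that means in practice, here is a minimal sketch, in plain C, of the sort of sample-by-sample work a softmodem pushes onto the CPU: synthesising a frequency-shift-keyed tone for a stream of bits. The sample rate, baud rate and tone frequencies here are illustrative assumptions, not any particular modem standard.

    /* Minimal sketch: software FSK tone generation, the kind of
       sample-by-sample DSP a softmodem offloads onto the CPU.
       Rates and frequencies are illustrative, not a real modem spec. */
    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define SAMPLE_RATE 8000       /* samples per second */
    #define BAUD        300        /* bits per second    */
    #define FREQ_MARK   1270.0     /* tone for a 1 bit   */
    #define FREQ_SPACE  1070.0     /* tone for a 0 bit   */

    int main(void)
    {
        const char *bits = "10110010";             /* data to send */
        int samples_per_bit = SAMPLE_RATE / BAUD;
        double phase = 0.0;

        for (const char *b = bits; *b; b++) {
            double freq = (*b == '1') ? FREQ_MARK : FREQ_SPACE;
            for (int i = 0; i < samples_per_bit; i++) {
                /* One 16-bit PCM sample per iteration: this inner
                   loop is the "blip by blip" work the CPU does. */
                short sample = (short)(32767.0 * sin(phase));
                phase += 2.0 * M_PI * freq / SAMPLE_RATE;
                fwrite(&sample, sizeof sample, 1, stdout);
            }
        }
        return 0;
    }

Every sample of every tone comes out of that inner loop; a dedicated modem chip would do the same job in its own silicon and leave the CPU alone.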

The same goes for basic sound synthesis and serial comms: a USB serial port is under the bit-by-bit control of the central processor. And the question "Why?" is easily answered: "Because it's much cheaper to have just the one chip than a whole army of support processors."

That has simply not been true of graphics.

The graphics workload is enormous. Even before you start running video games or displaying DVD movies, you're asking the silicon to handle far more than it can comfortably manage; in many PCs, in fact, there is more processing power in the graphics card than in the main processor. And that power is focused - it runs relatively few instructions, very quickly, and they are all graphics-oriented... polygon handlers.

A standard Intel processor simply isn't optimised for that. Not that the attempt hasn't been made: in the past, Intel made several attempts to build multimedia extensions (MMX) into its central technology. It wasn't a success, for a number of reasons, and at one point Intel actually tried to purge the error by stating that MMX did not stand for "multimedia extensions". That was correct in the sense that nobody used them. Intel heavily promoted C++ compilers that would take advantage of the instructions, but even that didn't make them standard.
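
For the record, this is roughly what those extensions offered: a handful of 64-bit SIMD integer instructions, reached from C through compiler intrinsics. A minimal sketch, assuming an x86 compiler with MMX support (e.g. gcc -mmmx); the pixel values are made up for illustration.

    /* Minimal sketch of MMX-style SIMD: add two rows of eight
       8-bit pixels at once, with saturation, via intrinsics.
       x86 only; build with e.g. gcc -mmmx. Data is illustrative. */
    #include <mmintrin.h>
    #include <stdio.h>

    int main(void)
    {
        /* A union keeps the data aligned for the 64-bit MMX type. */
        union vec { __m64 v; unsigned char b[8]; };
        union vec a = { .b = {200, 200, 200, 200, 10, 10, 10, 10} };
        union vec c = { .b = {100, 100, 100, 100,  5,  5,  5,  5} };
        union vec r;

        /* One instruction (PADDUSB) adds all eight bytes at once,
           clamping at 255 instead of wrapping round - the classic
           multimedia operation MMX was built for. */
        r.v = _mm_adds_pu8(a.v, c.v);
        _mm_empty();   /* EMMS: release the FPU registers MMX borrows */

        for (int i = 0; i < 8; i++)
            printf("%u ", r.b[i]);
        printf("\n");
        return 0;
    }

Eight saturating byte-adds in one instruction is exactly the kind of pixel arithmetic graphics work is made of - and exactly the kind of thing almost nobody hand-coded, which is why MMX languished.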

But that doesn't mean people didn't want multimedia extensions. They just wanted them off-chip.

Today's PC is far more multimedia oriented. High-definition graphics is "just coming" in television, yet it's already exceeded by what top-end PC graphics systems can manage - that's why a high-res flat screen for the PC is twice the price of the same-size screen for TV and video. In future, software will need to be written to switch, seamlessly and in real time, between the business applications in the background and the graphics and video work in the foreground - and the business applications may want to take direct advantage of the GPU themselves.

And the GPU isn't just for drawing pictures. Talk to any crypto expert and you'll find they are all trying to find ways of harnessing that extraordinary power. Engineers and scientists, too, are increasingly studying the use of GPUs for non-graphical calculations wherever their work involves matrix and vector operations.
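
As a taste of what that looks like, here is a minimal GPGPU sketch in CUDA C - the names and sizes are illustrative - performing a scaled vector addition in which thousands of GPU threads each handle one element, where a CPU would grind through the array serially.

    // Minimal GPGPU sketch: y = a*x + y over a million-element vector.
    // Each GPU thread handles one element; names and sizes are illustrative.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                        // guard the tail of the array
            y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;            // about a million elements
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));   // unified memory:
        cudaMallocManaged(&y, n * sizeof(float));   // visible to CPU and GPU
        for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        saxpy<<<blocks, threads>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %.1f (expect 5.0)\n", y[0]);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

The kernel itself is trivial; the point is the launch - a million elements dispatched across thousands of threads in a single call.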

The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels that it has to move now, before the GPU becomes part of Intel rather than part of a generic processor platform. If it is right, then the question of "how much did you pay for ATI?" is irrelevant. It may be a question of "How can you expect to survive without ATI?" ®
