AMD, ATI and the GPU

Breaking the monopoly

Intel has taken the "black hole" concept to extremes. You have a central processor which takes time out from running your software to do the weirdest things. It doesn't just control communications: if you use the modem socket on today's PC, the audio tones sent from your computer are generated, blip by blip, by the central processor; and the returning tones are analysed, wave by wave, by the same processor. It is so powerful that it does all this in between the processor cycles spent running Windows.
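To make "blip by blip" concrete, here is a minimal sketch - plain C, with illustrative parameters such as the 2100Hz answer tone and the 8kHz sample rate - of the sort of work a softmodem hands to the CPU: every sample of the carrier is computed in software.

    #include <math.h>
    #include <stdio.h>

    /* Sketch of what "generated, blip by blip, by the central processor"
       means: a softmodem synthesises its carrier in software, one sample
       at a time. Parameters are illustrative - a 2100Hz answer tone at
       an 8kHz sample rate. */
    int main(void)
    {
        const double pi = 3.14159265358979323846;
        const double sample_rate = 8000.0;  /* samples per second */
        const double tone_hz = 2100.0;      /* modem answer-tone frequency */
        double buffer[8000];                /* one second of audio */

        for (int n = 0; n < 8000; n++) {
            /* Every one of these 8,000 sine evaluations per second falls
               to the CPU, squeezed in around whatever else it is running;
               a hardware modem would do this in dedicated silicon. */
            buffer[n] = sin(2.0 * pi * tone_hz * n / sample_rate);
        }
        printf("first non-zero sample: %f\n", buffer[1]);
        return 0;
    }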

The same goes for basic sound synthesis and serial comms: a USB serial port is under bit-by-bit control of the CPU. And the question "why?" is easily answered: "Because it's much cheaper to have just the one chip, rather than a whole army of support processors."

That has simply not been true of graphics.

The graphics workload is enormous. Even before you start running video games or displaying DVD movies, you're asking the silicon to handle far more than it can comfortably manage; in many PCs, in fact, there is more processing power in the graphics card than in the main processor. And that power is focused - the card runs relatively few instructions, very quickly, and they are all graphics-oriented: polygon handlers.

A standard Intel processor simply isn't optimised for that. Not that the attempt hasn't been made: in the past, Intel made several attempts to build multimedia extensions (MMX) into the central processor. It wasn't a success, for a number of reasons, and at one point Intel actually tried to bury the mistake by stating that MMX did not stand for "multimedia extensions" at all. Which was true, in the sense that nobody used them for multimedia. Intel heavily promoted C++ compilers that would take advantage of the instructions, but even that didn't make them standard.

But that doesn't mean people didn't want multimedia extensions. They just wanted them off-chip.

Today's PC is far more multimedia oriented. High Definition is "just coming" to television, yet it is already exceeded by what top-end PC graphics systems can manage - that's why a high-resolution flat screen for the PC is twice the price of the same-size screen for TV and video. In future, software will need to be written to switch, seamlessly and in real time, between background business applications and the graphics and video work done in the foreground - and those business applications may want to take direct advantage of the GPU themselves.

And the GPU isn't just for drawing pictures. Talk to any crypto expert and you'll find they are all trying to find ways of harnessing that extraordinary power. Engineers and scientists, too, are increasingly studying the use of GPUs for non-graphical calculations wherever the work involves matrix and vector operations.
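As a taste of what that looks like in code, here is a minimal GPGPU sketch written in CUDA - chosen purely as a common illustration of the idea, not as anything ATI or AMD shipped - which multiplies a matrix by a vector using one GPU thread per output row.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    /* One thread computes one row of y = A * x. */
    __global__ void matvec(const float *A, const float *x, float *y, int n)
    {
        int row = blockIdx.x * blockDim.x + threadIdx.x;
        if (row < n) {
            float sum = 0.0f;
            for (int col = 0; col < n; col++)
                sum += A[row * n + col] * x[col];
            y[row] = sum;
        }
    }

    int main()
    {
        const int n = 1024;
        size_t msize = n * n * sizeof(float), vsize = n * sizeof(float);

        /* Host-side data: A is all ones, x counts 0..n-1, so every entry
           of y should equal n*(n-1)/2. */
        float *A = (float *)malloc(msize);
        float *x = (float *)malloc(vsize);
        float *y = (float *)malloc(vsize);
        for (int i = 0; i < n * n; i++) A[i] = 1.0f;
        for (int i = 0; i < n; i++) x[i] = (float)i;

        float *dA, *dx, *dy;
        cudaMalloc(&dA, msize);
        cudaMalloc(&dx, vsize);
        cudaMalloc(&dy, vsize);
        cudaMemcpy(dA, A, msize, cudaMemcpyHostToDevice);
        cudaMemcpy(dx, x, vsize, cudaMemcpyHostToDevice);

        /* 1024 rows, 256 threads per block -> 4 blocks. */
        matvec<<<(n + 255) / 256, 256>>>(dA, dx, dy, n);
        cudaMemcpy(y, dy, vsize, cudaMemcpyDeviceToHost);

        printf("y[0] = %.0f (expected %d)\n", y[0], n * (n - 1) / 2);

        cudaFree(dA); cudaFree(dx); cudaFree(dy);
        free(A); free(x); free(y);
        return 0;
    }

The point of the sketch: a thousand rows are computed by a thousand threads running in parallel, which is exactly the shape of workload a conventional CPU would grind through one row at a time.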

The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels that it has to move now, before the GPU becomes part of Intel, rather than part of a generic processor platform. If it is right, then the question of "how much did you pay for ATI?" is irrelevant. It may be a question of "How can you expect to survive without ATI?" ®
