Original URL: https://www.theregister.com/2006/07/24/amd_etc/

AMD, ATI and the GPU

Breaking the monopoly

By Guy Kewney

Posted in Channel, 24th July 2006 13:27 GMT

Comment "We may lose business on Intel boards, but we will break the Intel monopoly." With these words, AMD's CFO Bob Rivet announced the takeover of graphics chip maker, ATI, offering a future of joined-up shared processing, split between CPU and GPU.

The deal, announced today, goes back some time. Last year, at Computex in Taipei, it was apparent that ATI and AMD were falling in love with the idea of using the powerful graphics processor to run computer programs, not just for animating video.

At that show, software developers were invited to the launch of the new dual-core AMD processors, with prototype applications that ran, not on the x86 central processor, but on the graphics chip. Examples included video editors which could handle the output stream live, in real time.

This concept is probably beyond the grasp of the typical financial analyst, and in the short term the City and Wall Street will probably panic, seeing only the likelihood that ATI will lose customers who make Intel motherboards, and the possibility that end-users who want Nvidia graphics will have to buy Intel.

"In 2008 and beyond, AMD aims to move beyond current technological configurations to transform processing technologies, with silicon-specific platforms that integrate microprocessors and graphics processors to address the growing need for general-purpose, media-centric, data-centric and graphic-centric performance," said the official statement.

Questions asked during the webcast today showed that this went clear over the heads of most business editors, who wanted to know where the immediate shareholder value arose, suggested that the $5.4bn price paid for ATI was far too high, and wondered how AMD might get out of it if/when it realised it was a fatal mistake.

Whether it is a mistake is almost impossible to guess. Writing software for the graphics processor is a challenge, and those who have decided that the GPU is too darned powerful to be left merely drawing triangles have come up against problems - mostly, a lack of software tools.

The obvious risk is that the idea is non-standard. The market has shown many times just how much it loves the Intel x86 instruction set, and how little it likes "improvements" like the Intel Itanium 64-bit processor. It was, of course, AMD itself which went out on a limb and said: "We'll stick with x86 and design a 64-bit version of that architecture" rather than switch to the alternative 64-bit design. Why would AMD now think it can get away with the sort of non-standard gamble that has, after all, hit Intel pretty hard?

It's a good question. The basis of Intel's thinking has been described as "a black hole" by former IBM Fellow Glenn Henry, founder of Centaur: "It's clear that the processor is the black hole, and that all silicon is going to fall into it at some point. Integration is inevitable, in our business."

Henry was looking at the business from a different viewpoint: he wanted to justify selling Centaur to Via Technologies pioneer Wenchi Chen (Via now owns Centaur): "Those who can do a processor can control their system destiny, and those who don't will end up totally at the mercy of other people, who can shut them out of business right away," Henry explained.

And as an example, he said, when IDT bought Centaur in the first place, they were making a lot of money on SRAMs that went into caches on PC motherboards. "Two years later, there were no SRAMs on PC motherboards, because Intel put them on the die. That's going to happen to some of the chips today. All the other chips are gonna disappear at some point, and all that's left is the big chip, with the processor in the middle. You have to own that processor technology, or it won't be your chip."

Intel has taken the "black hole" concept to extremes. You have a central processor which takes time out from running your software to do the weirdest things. It doesn't just control communications: if you use the modem socket on today's PC, the audio tones sent from your computer are generated, blip by blip, by the central processor; and the returning tones are analysed, wave by wave, by the same processor. It is so powerful that it does all this in between the cycles it spends running Windows.

The same goes for basic sound synthesis, and serial comms: a USB serial port is under bit-by-bit control of the CPU. And the question "why?" is easily answered: "Because it's much cheaper to have just the one chip, rather than a whole army of support processors."
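
To make "blip by blip" concrete, here is a minimal C sketch of the kind of work a soft modem pushes onto the CPU: synthesising an audio tone one sample at a time. The 2100 Hz frequency is the standard modem answer tone; the sample rate, amplitude and buffer size are arbitrary choices for the illustration, not anything taken from a real driver.

    /* Illustrative sketch only: sample-by-sample tone generation of the
     * sort a "soft modem" asks the central processor to do.
     * Assumed values: 8 kHz sample rate, one second of audio, half-scale
     * amplitude. 2100 Hz is the dial-up answer tone. */
    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    #define SAMPLE_RATE 8000      /* telephone-quality audio */
    #define TONE_HZ     2100.0    /* modem answer tone */
    #define N_SAMPLES   8000      /* one second of samples */

    int main(void)
    {
        static int16_t pcm[N_SAMPLES];
        const double pi = 3.14159265358979323846;

        /* Each output sample costs the CPU a sine evaluation and a scale:
         * this is the work the article describes happening "blip by blip". */
        for (int i = 0; i < N_SAMPLES; i++) {
            double t = (double)i / SAMPLE_RATE;
            pcm[i] = (int16_t)(32767.0 * 0.5 * sin(2.0 * pi * TONE_HZ * t));
        }

        printf("generated %d samples of %.0f Hz tone\n", N_SAMPLES, TONE_HZ);
        return 0;
    }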

That has simply not been true of graphics.

The graphics workload is enormous. Even before you start running video games or displaying DVD movies, you're asking the silicon to handle far more than it can comfortably manage; in many PCs, in fact, there is more power in the graphics card than in the main processor. And that power is focused - it runs relatively few instructions, very quickly. They are all graphics-oriented... polygon handlers.

A standard Intel processor simply isn't optimised for that. The attempt has been made, though: in the past, Intel made several runs at building multimedia extensions (MMX) into the central processor. It wasn't a success, for a number of reasons, and at one point Intel actually tried to purge the error by declaring that MMX did not stand for "multimedia extensions". That was correct in the sense that nobody used them. Intel heavily promoted C++ compilers that would take advantage of the instructions, but even that didn't make them standard.
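
For readers who never met them, the MMX instructions apply one operation across several packed values at once. Here is a minimal C sketch using Intel's MMX intrinsics - the pixel data and the "brighten by 30" step are invented for the example, and compiler support for these ageing intrinsics varies (gcc needs -mmmx; 64-bit MSVC drops them entirely).

    /* A sketch of the sort of work MMX was meant for: brightening eight
     * 8-bit greyscale pixels with a single saturating-add instruction. */
    #include <mmintrin.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned char pixels[8] = { 10, 100, 200, 250, 0, 55, 128, 240 };
        unsigned char boost[8]  = { 30, 30, 30, 30, 30, 30, 30, 30 };
        unsigned char out[8];

        __m64 p, b, r;
        memcpy(&p, pixels, 8);
        memcpy(&b, boost, 8);

        r = _mm_adds_pu8(p, b);   /* PADDUSB: eight adds, saturating at 255 */
        memcpy(out, &r, 8);

        _mm_empty();              /* EMMS: release the FPU registers MMX borrows */

        for (int i = 0; i < 8; i++)
            printf("%u ", out[i]);
        printf("\n");
        return 0;
    }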

But that doesn't mean people didn't want multimedia extensions. They just wanted them off-chip.

Today's PC is far more multimedia oriented. High-definition video is "just coming" to television; it's already exceeded by what top-end PC graphics systems can manage - that's why a high-res flat screen for the PC is twice the price of the same-size screen for TV and video. In the future, software will need to be written to switch, seamlessly and in real time, between the background business applications and the graphics and video work done in the foreground - and the business applications may want to take direct advantage of the GPU.

And the GPU isn't just for drawing pictures. Talk to any crypto expert and you'll find they are all trying to find ways of harnessing that extraordinary power. Engineers and scientists are increasingly studying the use of GPUs for non-graphical calculations wherever their computations involve matrix and vector operations.
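
As an illustration of the kind of matrix-and-vector arithmetic that maps well onto a graphics chip, here is a plain C reference version of a small matrix-vector multiply; it runs serially on the CPU, with comments noting how a GPU would split the work. The matrix, vector and size are made up for the example.

    /* A minimal sketch of matrix-vector arithmetic. On a GPU, each element
     * of the result would typically be computed by its own shader/thread,
     * so all the dot products run in parallel instead of in this loop. */
    #include <stdio.h>

    #define N 4

    static void matvec(const float m[N][N], const float v[N], float out[N])
    {
        for (int row = 0; row < N; row++) {   /* on a GPU: one thread per row */
            float sum = 0.0f;
            for (int col = 0; col < N; col++)
                sum += m[row][col] * v[col];
            out[row] = sum;
        }
    }

    int main(void)
    {
        const float m[N][N] = {
            { 1, 0, 0, 0 },
            { 0, 2, 0, 0 },
            { 0, 0, 3, 0 },
            { 0, 0, 0, 4 },
        };
        const float v[N] = { 1, 1, 1, 1 };
        float out[N];

        matvec(m, v, out);
        for (int i = 0; i < N; i++)
            printf("%g ", out[i]);
        printf("\n");
        return 0;
    }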

The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels that it has to move now, before the GPU becomes part of Intel's platform rather than part of a generic processor platform. If it is right, then the question of "how much did you pay for ATI?" is irrelevant. It may be a question of "How can you expect to survive without ATI?" ®