AMD, ATI and the GPU

Breaking the monopoly

Intel has taken the "black hole" concept to extremes. You have a central processor which takes time out from running your software to do the weirdest things. It doesn't just control communications: if you use the modem socket on today's PC, the audio tones sent from your computer are generated, blip by blip, by the central processor; and the returning tones are analysed, wave by wave, by the same processor. It is so powerful that it does all this in between the processor cycles spent running Windows.

The same goes for basic sound synthesis and serial comms: a USB serial port is under bit-by-bit control of the CPU. And the question "why?" is easily answered: because it's much cheaper to have just the one chip than a whole army of support processors.

That has simply not been true of graphics.

The graphics workload is enormous. Even before you start running video games, or displaying DVD movies, you're asking the silicon to handle far more than it can comfortably manage; in many PCs, in fact, there is more power in the graphics card than in the main processor. And that power is focused - the card runs relatively few instructions, very quickly, and they are all graphics-oriented... polygon handlers.

A standard Intel processor simply isn't optimised for that. Not that the attempt hasn't been made: in the past, Intel made several attempts to build multimedia extensions (MMX) into the central processor. It wasn't a success, for a number of reasons, and at one point Intel actually tried to purge the error by insisting that MMX did not stand for "multimedia extensions". That was correct in the sense that nobody used them. Intel heavily promoted C++ compilers that would take advantage of the instructions, but even that didn't make them standard.

But that doesn't mean people didn't want multimedia extensions. They just wanted them off-chip.

Today's PC is far more multimedia-oriented. High-definition video is "just coming" to television; it's already exceeded by what top-end PC graphics systems can manage - that's why a high-res flat screen for the PC costs twice as much as the same-size screen for TV and video. In future, software will need to be written to switch, seamlessly and in real time, between background business applications and the graphics and video work done in the foreground - and the business applications may want to take direct advantage of the GPU.

And the GPU isn't just for drawing pictures. Talk to any crypto expert and you'll find they are all looking for ways to harness that extraordinary power. Engineers and scientists, too, are increasingly studying the use of GPUs for non-graphical calculations wherever their computations involve matrix and vector operations.
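The sort of workload meant here is data-parallel: the same arithmetic applied independently to every element of a large array, which is exactly what a GPU's many cores do at once. A minimal sketch in plain Python (the function name and values are illustrative, not taken from any GPU API):

```python
# SAXPY (y = a*x + y): the canonical data-parallel vector kernel.
# Each element of the result depends only on the matching elements
# of the inputs -- no ordering, no shared state -- so on a GPU every
# index could be handed to its own hardware thread.
def saxpy(a, x, y):
    return [a * xi + yi for xi, yi in zip(x, y)]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
# result == [12.0, 24.0, 36.0]
```

A CPU works through that loop an element or two at a time; a GPU runs thousands of such independent computations simultaneously, which is why matrix and vector work maps onto it so well.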

The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels that it has to move, now, before the GPU becomes part of Intel, rather than part of a generic processor platform. If it is right, then the question "how much did you pay for ATI?" is irrelevant. The question may instead be: "How can you expect to survive without ATI?" ®
