PowerColor Radeon HD 2900 XT graphics card
Worth the wait?
Review AMD/ATI has been working on its DirectX 10 graphics chip, codenamed 'R600', for an awfully long time, which is somewhat surprising when you consider that this is its second unified-shader design. The first was the Xenos chip that powers Microsoft's Xbox 360 games console.
Although AMD is announcing a full range of Radeon HD 2400, 2600 and 2900 chips today, the only chip that is actually available is the Radeon HD 2900 XT. It won't make the mainstream HD 2400 and 2600 parts available until the end of June.
Priced at £249/$399, the HD 2900 XT is aimed squarely at Nvidia's GeForce 8800 GTS, so we're in the upper ranges of gaming performance, but this is the first time in living memory that a manufacturer hasn't claimed the 'Fastest Ever' performance crown with a new generation of chips.
The HD 2900 XT reference board measures in at 23.8cm (9.5in) in length and has an impressive specification. Feel free to imagine a Sid James laugh at this point. Fabbed at 80nm, the chip contains 700m transistors, no small number of which comprise its 320 'Stream' unified shader processors. The core's clocked at 740MHz, while the memory - 512MB of GDDR 3, connected across a 512-bit, eight-channel bus - runs at an effective 1650MHz. There are connectors for CrossFire and a second power port.
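To put that 512-bit bus in perspective, peak memory bandwidth follows directly from the figures above - bus width in bytes multiplied by the effective data rate. A quick back-of-the-envelope check:

```python
# Peak memory bandwidth from the figures quoted in the review:
# a 512-bit bus running at an effective (DDR) 1650MHz.
bus_width_bits = 512
effective_rate_hz = 1650e6

bandwidth_bytes_per_s = (bus_width_bits / 8) * effective_rate_hz
print(f"{bandwidth_bytes_per_s / 1e9:.1f} GB/s")  # prints "105.6 GB/s"
```

That works out to roughly 105GB/s of raw bandwidth, which goes some way to explaining why AMD felt a 512-bit bus justified the board-design headaches.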
On the back-plane are a pair of dual-link DVI connectors and an s-video port. The card supports HDMI with the associated HDCP anti-copying system, and while there's no HDMI port on the card itself, expect to see reference board-based products come bundled with a DVI-to-HDMI dongle.
The 2900 XT's HDMI add-on
We'll also see a 30cm (12in) Radeon HD 2900 XTX with a massive 1GB of GDDR 4 memory, but it's only due to be made available to PC manufacturers as it will require an appropriate case and motherboard - imagine Barbara Windsor this time.
Long before the HD 2900 XT came to light there were rumours that R600 had a huge power draw of more than 200W. When we looked at the PowerColor HD 2900 XT and saw that it has two power connectors, we noted that one is the usual six-pin while the other is a new eight-pin connector. You can safely plug six-pin connectors into both points, so this is clearly a nod to the next generation of power supplies. However, it sounded alarm bells, so let's put the rumours to rest.
During our testing we found that the HD 2900 XT draws 70W at idle and 140W when it's working. An Nvidia GeForce 8800 Ultra has very similar figures: 80W at idle and an operational power draw of 130W. Given this similarity it's a shame that AMD couldn't match the quiet cooling solution used by Nvidia. The HD 2900 XT is whisper-quiet during Windows duties, but when you start gaming the fan spins up to generate a noise level that's similar to a fan-assisted oven. It's not the most imaginative description, but the tone is more insistent than a CPU cooler, yet 'whine' would be far too harsh a word.
Both AMD and Nvidia have stressed that their DirectX 10 parts have the dual roles of playing back high-definition movies as well as bringing a new level of quality of gaming to the PC. This way of thinking caused us a few problems during testing as there can only be a tiny handful of technophiles who stream Blu-ray Disc or HD-DVD movies from their PC to their HDTV. AMD's Universal Video Decoder (UVD) video engine and support for AC3 5.1-channel audio over HDMI are all well and good but do you expect to use them in the near future?
Similarly, DirectX 10 games are on the way but right now the only games worth playing still use DirectX 9. AMD supplied a 1.2GB DirectX 10 demo and benchmark called Call of Juarez that ran a couple of times but mainly crashed and crashed again under 32-bit Windows Vista Ultimate. We ended up testing the R600 under 32-bit Windows XP SP2 on an Abit AB9 Quad GT motherboard with Intel 975X chipset, a Core 2 QX6800 processor, 4GB of fast DDR 2 memory and a Western Digital Raptor 150 hard drive.
The PowerColor HD 2900 XT that we're reviewing here is the dead spit of the reference sample that we got from AMD, so we were also able to run two 2900 XTs in CrossFire. Consequently, we also compared £498 of dual Radeon HD 2900 XTs with a £485 Asus EN8800 Ultra - as the name suggests, based on the GeForce 8800 Ultra. The drivers we used didn't include the Overdrive overclocking feature, and 3DMark06 doesn't recognise the hardware, so we were instructed to use a switch to get the benchmark to run.
3DMark06 benchmark results
Longer bars are better
The results in 3DMark06, Half-Life 2 and Elder Scrolls: Oblivion make it plain that this hardware has no place connected to a low-resolution screen. With every quality setting enabled, the 2900 XT performs extremely capably, and with two cards in CrossFire it gave the 8800 Ultra a considerable spanking.
Half-life 2 Lost Coast benchmark results
Frame rates - longer bars are better
Elder Scrolls: Oblivion benchmark results
Frame rates - longer bars are better
This leaves DirectX 10 gaming as a great unknown and we simply won't be able to answer that question for a few months to come - or just as soon as a rather more stable version of Call of Juarez appears.
We also can't investigate the new features of the HD 2900 XT such as Custom Filter Anti-Aliasing (CFAA). Modes run from 4x CFAA - which is 2x anti-aliasing run through a Narrow Tent filter - to 24x CFAA, which is 8x anti-aliasing with Edge Detection. It all sounds superb, but until we see these modes in action it's not much more than techno-babble.
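For the curious, the idea behind a tent-filter resolve is simple enough to sketch. This is not AMD's implementation - just a minimal one-dimensional illustration of the principle: each sample's contribution to the final pixel colour falls off linearly with its distance from the pixel centre, and a wider tent pulls in samples from neighbouring pixels, which is how 2x anti-aliasing can be stretched into a 4x CFAA mode. The function names and sample values here are our own invention.

```python
def tent_weight(distance, radius):
    # Triangle (tent) filter: weight falls linearly from 1.0 at the
    # pixel centre to 0.0 at the filter radius.
    return max(0.0, 1.0 - distance / radius)

def tent_resolve(samples, radius=1.0):
    # Resolve a pixel colour from (offset, colour) pairs, where offset
    # is the sample's distance from the pixel centre. A 'narrow' tent
    # keeps the radius close to the pixel; a wide one blends in samples
    # from neighbours, smoothing edges at the cost of some sharpness.
    total_weight = 0.0
    colour = 0.0
    for offset, sample_colour in samples:
        w = tent_weight(abs(offset), radius)
        total_weight += w
        colour += w * sample_colour
    return colour / total_weight if total_weight else 0.0

# Two samples inside the pixel plus one borrowed from a neighbour:
print(tent_resolve([(-0.25, 0.0), (0.25, 1.0), (0.9, 1.0)]))  # 0.53125
```

The neighbouring sample gets only a tenth of the weight of the in-pixel samples, so it nudges the result rather than dominating it - which is the whole point of weighting by a filter rather than averaging every sample equally.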
Looking back at our DirectX 9 testing figures, we can extrapolate that the HD 2900 XT will indeed match an 8800 GTS, so AMD is delivering fair value for money. On the down side, it's clear to us that AMD has been unable to extract the performance that it wanted from R600 despite the employment of enormous core and memory speeds.
Nvidia grabbed the high end with the 8800 GTX and Ultra, while AMD struggled to get the HD 2900 XT out of the factory door. Now it's here, and with boards priced at around £249, the Radeon HD 2900 XT takes the fight to the GeForce 8800 GTS, and it's a very even battle. Hopefully, we'll have a clearer idea of their respective abilities when DirectX 10 games are published.