On the back-plane are a pair of dual-link DVI connectors and an s-video port. The card supports HDMI with the associated HDCP anti-copying system, and while there's no HDMI port on the card itself, expect to see reference board-based products come bundled with a DVI-to-HDMI dongle.
The 2900 XT's HDMI add-on
We'll also see a 30cm (12in) Radeon HD 2900 XTX with a massive 1GB of GDDR4 memory, but it's only due to be made available to PC manufacturers as it will require an appropriate case and motherboard - imagine Barbara Windsor this time.
Long before the HD 2900 XT came to light there were rumours that R600 had a huge power draw of more than 200W. When we looked at the PowerColor HD 2900 XT and saw that it has two power connectors, we noted that one is the usual six-pin while the other is a new eight-pin connector. You can safely plug six-pin connectors into both points, so this is clearly a nod to the next generation of power supplies. However, it sounded alarm bells, so let's put the rumours to rest.
During our testing we found that the HD 2900 XT draws 70W at idle and 140W when it's working. An Nvidia GeForce 8800 Ultra has very similar figures: 80W at idle and an operating power draw of 130W. Given this similarity it's a shame that AMD couldn't match the quiet cooling solution used by Nvidia. The HD 2900 XT is whisper-quiet during Windows duties but when you start gaming the fan spins up to generate a noise level that's similar to a fan-assisted oven. It's not the most imaginative description but the tone is more insistent than a CPU cooler, yet 'whine' would be far too harsh a word.
Both AMD and Nvidia have stressed that their DirectX 10 parts have the dual roles of playing back high-definition movies and bringing a new level of gaming quality to the PC. This way of thinking caused us a few problems during testing, as there can only be a tiny handful of technophiles who stream Blu-ray Disc or HD-DVD movies from their PC to their HDTV. AMD's Universal Video Decoder (UVD) video engine and support for AC3 5.1-channel audio over HDMI are all well and good, but do you expect to use them in the near future?
Similarly, DirectX 10 games are on the way but right now the only games worth playing still use DirectX 9. AMD supplied a 1.2GB DirectX 10 demo and benchmark called Call of Juarez that ran a couple of times but mainly crashed again and again under 32-bit Windows Vista Ultimate. We ended up testing the R600 under 32-bit Windows XP SP2 on an Abit AB9 Quad GT motherboard with Intel 975X chipset, a Core 2 QX6800 processor, 4GB of fast DDR2 memory and a Western Digital Raptor 150 hard drive.
Quality and speed
Frantisek, HardOCP has done a bit of a demolition job on the 2900XT partly with regard to the different AA filters including the different tents.
They have included a number of screengrabs to illustrate the quality of AMD versus Nvidia, and frankly they are pretty much identical in most respects; however, when it comes to AA the AMD card's images look blurred. I have no intention of taking issue with HardOCP - in the main they do excellent work - however I have been in this situation before, way back in the days of the Radeon 9800.
This was a time when AA was quite novel and I simply wanted to illustrate what it could do, so I used Microsoft Train Simulator as an example because the train tracks suffer from horrible jaggies as they converge towards the horizon. Enabling AA smoothed the image considerably, so I set about taking screen grabs to illustrate the success.
In the images the rails looked good with AA, but the trees that line the track looked like green lollipops with no distinguishing features, so the grabs with AA looked terrible. Much like the images on HardOCP.
Of course I was looking at stills from a moving image and was effectively missing the point. As you drive past a tree you can't see the leaves and branches. Stand under a tree in Oblivion and you get the benefit of all sorts of eye candy.
Ever since I have been very wary of taking screen grabs as they often show visual information that is out of context.
As for the most recent comment, yes, it would have been 'fair' to run a pair of Ultras in SLI; however, I only had one Ultra - does anyone have two? - and simply wanted to illustrate how £500 of AMD hardware compared with £500 of Nvidia hardware.
SLI vs CrossFire?
Two things spring to mind:
- If you used CrossFire for the ATI, surely you should have included an SLI config for the Nvidia setup, for a more balanced report?
- Is it just me, or do CrossFire's performance benefits look feeble (on a percentage increase in frame rate basis) compared to typical performance gains from SLI?
Reply to ref Numbers, numbers
Fair enough, I respect your view and find it meaningful. Thank you very much for the reply.
I have seen some image quality comparisons lately, and the use of CFAA with Wide Tent in Oblivion on the 2900 XT seems to help overall quality a lot. I'm looking forward to your further testing.