AMD ATI Radeon HD 5970 two-GPU graphics card
Serious performance, serious price
Review Cast your eye over our news piece on AMD's ATI Radeon HD 5970 and our review of the HD 5870 and you’ll have the essential information at your fingertips. AMD has, for some unknown reason, changed its naming convention, so this two-chip HD 5870-based graphics card has been named HD 5970 instead of the more predictable HD 5870 X2.
Sapphire's Radeon HD 5970: overclocked, slightly
What we have here is a very long graphics card with two HD 5870 GPUs linked by a PCI Express 2.1 bridge chip, along with two banks of GDDR5 memory totalling 2GB. A single HD 5870 packs 1600 stream processors into one 40nm core, so the HD 5970 boasts a startling total of 3200 shaders.
The key features have been carried over from the HD 5870, including support for DirectX 11, triple monitor support with ATI Eyefinity and some nifty power-saving technology. That said, although AMD has worked wonders to reduce the power draw of the HD 5000 series at idle, the fact remains that the two chips draw plenty of power when they are under load.
The mid-range HD 5770 draws 108W under load, the HD 5850 draws 151W and the HD 5870 is rated at 188W. Had AMD kept the speeds and feeds of the HD 5970 at the same 850MHz core/4800MHz memory used by the HD 5870, it's reasonable to estimate the loaded power figure would have climbed to around 360W. AMD tells us that the enormous cooling package used on the HD 5970 can handle 400W, so these figures might be sustainable - provided we could stand having the cooling fan running at full tilt during a gaming session.
AMD decided to keep the maximum power figure below 300W by reducing the voltage fed to the graphics core and memory. This led to a reduction in clock speeds to 725MHz/4000MHz - speeds familiar to anyone who owns or has read about the HD 5850. This means that the HD 5970 is something of a mongrel: a kind of Radeon HD 5850 X2 with extra shaders. The loaded power figure for the HD 5970 is 294W which is pleasingly close to double the 151W for the HD 5850.
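The power argument here is simple doubling arithmetic, and it checks out. A quick sketch using the review's own figures (the note about shared board overheads is our gloss, not AMD's):

```python
# Loaded-power figures quoted in the review, in Watts
hd5770, hd5850, hd5870 = 108, 151, 188

# Naive doubling of a full-speed HD 5870 - this ignores anything the
# two GPUs share on one board (bridge chip, fan, regulator losses),
# which is presumably why the review's estimate is a little lower
naive_dual_5870 = 2 * hd5870
print(naive_dual_5870)  # 376, in the same ballpark as the ~360W estimate

# The shipping HD 5970 instead runs at HD 5850 clocks and voltages
dual_5850 = 2 * hd5850
print(dual_5850)        # 302, close to the quoted 294W figure
```

Either way, the card lands just under the 300W ceiling AMD was aiming for.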
The port in the middle is the Mini DisplayPort
This emphasis on controlling power draw and dissipating heat has dictated the layout of the graphics card. The exhaust vent for the cooling package fills the upper half of the dual-slot bracket, while the three graphics connectors are arranged in a row underneath: two dual-link DVI outputs and a Mini DisplayPort. The standard AMD package includes an adaptor to convert the Mini DisplayPort to full-size DisplayPort.
Pricey, but for me there is an advantage
Ever since I built this machine years ago, when it was running two nVidia 7900GTXs (and heating my room at the same time), I've had a Matrox TripleHead2Go monitor splitter to give me three-screen gaming (and loads of screen space for programming IDEs). Not every game ran well with it - in fact, while a lot of games list the resolution as available on the options screen, actually selecting it completely knackers the perspective and makes the game unplayable. Escape From Butcher Bay is a good example.
With Eyefinity, I could dump my extortionately-priced and annoyingly analogue first-generation external splitter box on eBay, buy a couple of adaptors for my existing monitors, and use the money to soup the machine up even further. Say, with a huge, expensive, completely over the top graphics card? Hell yeah...
Definitely something I'll be looking into. So long as it doesn't require the same number of wires and fecking about as my existing setup - and considering this thing already has 3 output connectors without any external boxes at all, that's quite likely - I would be very interested.
Review makes no sense..
The price is totally irrelevant - it's still cheaper than an nVidia card that costs more and isn't as powerful.
So how did it get a 65% score exactly?
Anybody that buys a card that's this powerful (me included) isn't going to give a damn about the price.. except in the knowledge you're going to get more bang-for-buck than with nvidia..
Seriously this review is just wrong.. No really.
You review what it is - arguably the most powerful single card money can buy - and if you can get comparable performance for cheaper (if nVidia could actually do cards even close to this powerful) /then/ you start knocking points off...
Put two GTX 295s in your PC: it's not going to be as powerful, it's going to use up at least four slots in your case and it's going to cost you 800 quid before you even get started on the 15TW PSU you're going to need.
Seriously - what's the deal? I mean really where is the nVidia comparison anyways?
While I'm ranting..
"so these figures might be sustainable provided we could stand having the cooling fan running at full tilt during a gaming session"
Firstly, most people are going to water cool, and secondly.. yes, if you play games without sound it's going to get annoying - but who does that..
If we're going to talk about the sound it makes - what's it like at idle? If it's quiet in a normal environment when it's not being pounded, that's all that matters.
I have always said there aren't enough computer rendered female rastafarian models in the graphics card industry.
I always worry...
... that if I watch late night TV and get bombarded by the inevitable 'chatline' adverts aimed at the sad and lonely, I have actually become part of that target audience simply by being there.
By the same token if I bought this I would become part of the group of buyers likely to be swayed by pictures of sci-fi body armoured young women on hi-tech equipment. I'm not sure which group is sadder.