The reason for this tiny overclocking headroom is that AMD has set the core voltage to 1.105V and the memory voltage to 1.1V. For this reason, AMD issued copies of its ATI OverVolt tool to favoured reviewers but not to us, or to punters. We are told this software allows you to raise the voltages to 1.1625V and 1.15V in a single step, like an on/off switch.
Sapphire's Redline utility lets you up the GPUs' voltages
We installed the Redline tool, which allows you to increase the voltages in steps. We immediately hit a snag when the utility refused to run as, apparently, it was developed on Windows Vista and doesn’t much like Windows 7. We created a shortcut to the utility, set it to run in compatibility mode with Windows XP and, lo, all was well.
We ramped up the voltages to the max and were able to overclock to 890MHz/4600MHz which means we effectively restored the HD 5870 clock speeds. We measured an increase in power draw at the mains plug from 350W to 380W. That’s a modest amount of extra power but you have to wonder just how much more load the next generation of DirectX 11 games will place on this type of hardware.
Put it this way, we strongly suspect that AMD has chosen the voltage settings and clock speeds with very good reason. Overclockers, proceed with caution.
If you prefer to throw caution to the wind you might like to consider the Asus HD 5970 Voltage Tweak which allows a core voltage of 1.35V and has a claimed overclocking potential of 950MHz core and 5012MHz memory. And presumably you can also use it to grill sausages.
Spending more than £500 on a graphics card is a serious decision. When you consider that we thought the HD 5870 was over the top, it will come as no surprise that we classify the HD 5970 as a frivolous toy. But, darn it, what a toy. ®
Pricey, but for me there is an advantage
Ever since I built this machine years ago, when it was running two nVidia 7900GTXs (and heating my room at the same time), I've had a Matrox Triplehead2Go monitor splitter to give me three-screen gaming (and loads of screen space for programming IDEs). Not every game ran well with it - in fact, while a lot of games list the resolution as available in the options screen, actually selecting it completely knackers the perspective and makes the game unplayable. Escape From Butcher Bay is a good example.
With Eyefinity, I could dump my extortionately-priced and annoyingly analogue first-generation external splitter box on eBay, buy a couple of adaptors for my existing monitors, and use the money to soup the machine up even further. Say, with a huge, expensive, completely over the top graphics card? Hell yeah...
Definitely something I'll be looking into. So long as it doesn't require the same number of wires and fecking about as my existing setup - and considering this thing already has 3 output connectors without any external boxes at all, that's quite likely - I would be very interested.
Review makes no sense..
The price is totally irrelevant, and it's still cheaper than an nVidia card that costs more and isn't as powerful.
So how did it get a 65% score exactly?
Anybody that buys a card that's this powerful (me included) isn't going to give a damn about the price.. except in the knowledge you're going to get more bang-for-buck than with nvidia..
Seriously this review is just wrong.. No really.
You review what it is - arguably the most powerful single card money can buy - and if you can get comparable for cheaper (if nVidia could actually do cards even close to this powerful) /then/ you start knocking points off...
You put two GTX 295's in your PC, it's not going to be as powerful, it's going to use up at least 4 slots in your case and it's going to cost you 800 quid before you even get started with the 15TW PSU you're going to need.
Seriously - what's the deal? I mean really where is the nVidia comparison anyways?
While I'm ranting..
"so these figures might be sustainable provided we could stand having the cooling fan running at full tilt during a gaming session"
Firstly most people are going to water cool and secondly.. Yes, if you play games without sound it's going to get annoying but who does that..
If we're going to be talking about the sound it makes - what's it like at idle? If in a normal environment when it's not being pounded and it's quiet, that's all that matters.
I have always said there aren't enough computer rendered female rastafarian models in the graphics card industry.
I always worry...
... that if I watch late night TV and I get bombarded by the inevitable 'chatline' adverts aimed at the sad and lonely I have actually become part of that target audience simply by being there.
By the same token if I bought this I would become part of the group of buyers likely to be swayed by pictures of sci-fi body armoured young women on hi-tech equipment. I'm not sure which group is sadder.