This choice of cooling system has allowed Sapphire to fit four DVI-I connectors to the bracket - overkill for most of us, but it will doubtless fill some people with joy. The Sapphire package includes a DVI-to-HDMI adaptor, a DVI-to-VGA adaptor, a composite-video adaptor and a component-video adaptor.
In addition, there is a decent collection of CyberLink software, consisting of PowerDVD 7 and the CyberLink DVD Suite, which is made up of PowerProducer 4, PowerDirector 5 Express, Power2GO 5.5 and Medi@show 3.
AMD's ATI Radeon HD 4870 X2: scarily high power draw
We tested the Sapphire back-to-back with a reference 4870 X2 that sells for £399 and a reference GTX 280 that is typically available for £330. Our test system was an Asus P6T Deluxe motherboard with an Intel Core i7 965 Extreme 'Nehalem' CPU and 3GB of 1,066MHz DDR3 memory, running Windows Vista Ultimate Edition.
We tested the GeForce GTX 280 with Nvidia’s latest 180.44 beta driver, which boosts the card in 3DMark Vantage by offloading the PhysX workload from the CPU to the GPU. This raises the score for the CPU element of the test by a huge amount which, in turn, raises the overall score and appears to give the GTX 280 the same performance as the overclocked 4850 X2.
You’ll probably get a more accurate impression of the GTX 280 if you look at the GPU results in 3DMark Vantage, or at 3DMark06, where you can see that performance drops off sharply when anti-aliasing is added to the workload.
Nvidia's GeForce GTX 280: impressive power-saving features
Benchmark runs in Far Cry 2 show the GTX 280 in a very good light, due at least in part to it being an Nvidia-sponsored game, but also because it's a darn fine graphics card. The performance figures can be interpreted in a number of ways, but there's no denying that the power-saving features of the GTX 280 are very impressive: it draws just 135W in Windows, when it is practically silent, and 280W when the system is running at full pelt.
The two 4870 X2 bars in the charts have the same description - I assume the second is overclocked? However, some of the benchmark/timing results for various cards are clearly spurious when compared with their overall performance differentials. I suggest you revisit this.
ATI Linux driver support has improved dramatically over the past several years. I have a number of machines with ATI cards (3870, 24-something, a 690-based uATX board with integrated graphics and a Dell with embedded ATI graphics). The ATI installer is pretty bulletproof and the performance is good. Not at all like back in the days of the 9700. I used to worry about using ATI with Linux: I don't any more.
I have read that neither ATI nor Nvidia supports CF/SLI under Linux. I don't know if it is true. Can anyone confirm or refute?
The driver is built around a kernel module and if the kernel is updated then the graphics driver must be reinstalled, which is a pain. But Nvidia is the same in this regard.
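For anyone bitten by this, here's a rough sketch of a post-kernel-update check. It assumes ATI's fglrx module name (swap in nvidia for the green side) and only reports - it doesn't touch anything:

```shell
#!/bin/sh
# Sketch: check whether the vendor graphics module exists for the
# running kernel. "fglrx" is ATI's Linux module name; use "nvidia"
# for Nvidia cards. Nothing here is destructive - it only reports.
MODULE=fglrx
KVER=$(uname -r)
if modinfo "$MODULE" >/dev/null 2>&1; then
    echo "$MODULE module is available for kernel $KVER"
else
    echo "$MODULE module missing for kernel $KVER - re-run the driver installer"
fi
```

Distros with DKMS set up will rebuild the module for you on kernel updates, which takes most of the pain out of it.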
FWIW, ATI's CAL or Stream Computing SDK also runs on Linux now. As far as I have played with it, it seems to work well (I have not had as much experience with CAL as with CUDA, though).
AFAIK - as of about eight months ago - neither CAL nor CUDA will use both GPUs if the GPUs are in Crossfire/SLI mode. This would be a problem with either ATI or Nvidia. If the cards are not paired up then both are available - I've verified this with an Nvidia 8600 and 8800 in the same box, and it is what ATI says in its docs.
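A quick way to see what the box actually exposes before pointing CAL or CUDA at it - just a sketch, assuming the vendor tools (nvidia-smi or aticonfig) are on the path, with plain PCI enumeration as a fallback:

```shell
#!/bin/sh
# Sketch: list the GPUs the OS can see. If the cards are paired in
# Crossfire/SLI, the compute stack may enumerate fewer devices than
# this shows, which is a quick way to spot the problem above.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi -L                 # one line per Nvidia GPU
elif command -v aticonfig >/dev/null 2>&1; then
    aticonfig --list-adapters     # one line per ATI adapter
else
    lspci 2>/dev/null | grep -i "vga compatible" \
        || echo "no VGA devices listed - is lspci installed?"
fi
```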
Looks as if
ATI may finally be getting their act together. When I get to rebuild my (now three-year-old) game rig I may have to take a look at their cards in comparison to nVidia. Thanks.
165W in Windows?
Quick Linux compatibility check?
Given that you've probably got quite a few Linux users in your readership, would it be possible to do a brief check of hardware like this on Linux?
Nvidia kit generally has very good OpenGL support, and works very well with simulators like X-Plane, Silent Wings or FlightGear. It would be nice if you could see whether ATI/AMD kit's Linux driver support is beginning to improve since the two companies merged.
It would be nice if you could do a more in-depth report than "yes, it works and it looks nice", but even that's better than nothing :-)
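Even something this simple would do as a first pass - a sketch assuming the mesa-utils package (which provides glxinfo) is installed and an X session is running:

```shell
#!/bin/sh
# Sketch of a minimal "does Linux like this card?" check: report the
# active OpenGL vendor/renderer/version and whether direct rendering
# (hardware acceleration) is enabled. Assumes mesa-utils (glxinfo).
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo 2>/dev/null \
        | grep -E "^(direct rendering|OpenGL (vendor|renderer|version))" \
        || echo "could not query OpenGL - is an X session running?"
else
    echo "glxinfo not found - install mesa-utils first"
fi
```

If the renderer line mentions the vendor driver rather than software rasterisation, the card is at least being driven properly.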