During our testing, both AMD cards were commendably quiet, but the copper heatsink on the HD 4850 got very hot to the touch; the Catalyst drivers report a temperature of 80°C. This won't be a problem once the graphics card is sealed inside your PC, but one edge of the heatsink runs along the top of the card, in the perfect position to catch the impatient reviewer by surprise as he swaps graphics cards between tests.
It’s a bit like having a plate of really hot sausages fresh out of the oven. Common sense says you should give them a few minutes to cool, but greed wins the day, and we now have the burns to prove it.
Moral: if you’re working with an HD 4850, give it enough time to cool down properly.
Radeon HD 4850: mind your fingers
The other major difference between the AMD and Nvidia cards is price. The GeForce GTX 280 is on its own somewhere in the stratosphere at £350, and the GeForce GTX 260 is a good bit cheaper at £215-£250. Compared to them, the new AMD cards are an absolute snip. The Radeon HD 4850 costs about £140, which makes it comparable with the GeForce 8800 GT and GTS, and the HD 4870 is, at £175, only slightly more expensive, which makes it comparable with the 9800 GTX.
We tested a Gigabyte HD 4870 and a Sapphire HD 4850 on the same Intel Skulltrail system that we used for our Zotac GTX 280 review, and also added an Asus HD 3850 X2, Asus HD 3850, Asus HD 3650, Sapphire HD 3450, PowerColor HD 2900 and Sapphire X1950 GT into the mix, just for fun.
In fact, we had two HD 4850s, one from Sapphire and one pre-production from AMD, so we naturally ran them in CrossFire, but as you’ll see from our figures there was something a bit strange about the pre-production card. Performance was very good, but its power consumption was 50W higher than the retail Sapphire's. You can take the Sapphire figures as gospel, but the CrossFire figures are raw and need some adjustment. We estimate that two HD 4850s in CrossFire draw a similar amount of power to a single HD 4870 when the system is under load.
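The adjustment described above amounts to simple subtraction: the pre-production card inflates the measured CrossFire total by its roughly 50W excess over a retail card, so knocking that off gives an estimate for two retail HD 4850s. A minimal sketch (the 300W reading below is a hypothetical example, not a figure from our tests):

```python
# Sketch of the CrossFire power-figure adjustment described above.
# The pre-production HD 4850 drew roughly 50 W more than the retail
# Sapphire card, so the raw CrossFire total includes that excess once.
PREPRODUCTION_EXCESS_W = 50  # difference observed between the two cards

def adjusted_crossfire_draw(raw_total_w):
    """Estimate what two retail HD 4850s would draw together, in watts."""
    return raw_total_w - PREPRODUCTION_EXCESS_W

# Hypothetical raw system reading of 300 W under load:
print(adjusted_crossfire_draw(300))  # 250
```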
What were you thinking when you made those graphs?
They tell you nothing, NOTHING I tell you!
And another thing...
Can't understand folks' bleating about the PowerPlay thing. There are so many folk giving it "ooh once again ATi screws us with the drivers, this is a joke, I'm sticking with nVidia at least the drivers work"
1) nVidia are just as bad if not worse for feature support in early drivers - look at the 7800GX2 fiasco.
2) If you want to spend more money on a card that will do the same job and consume more power then go ahead and quit whining
3) Er, wait a few weeks?
The only way I can see the "ZOMG the drivurz are skrewd and now I have to w8 a hole month" argument making any sense is if you're a spoilt kid buying into the latest architecture every time. Even then they don't come out often enough to make much sense out of such pissy bitchy whining.
PS: I'm not an ATi fanboy (until a few weeks ago I was poised to buy a 9800GTX) but I am a big fan of not having to filter out pissy whiny moans whilst reading comments on a mature IT news site.
Come on guys, everyone knows that Crysis is not an accurate benchmark. It has been dropped from tests on a number of occasions.
I'm waiting for anyone to do a fanless 4850 for xFire
Wow... a PS3 fanboi, hey?
I'm not even going to go into it, since so many people already did. I'd just be beating a dead ... vulture?
There is one situation, and one situation only, where Crossfire/SLI come into their own. Big displays. Really big displays. I'm running at 3840x1024, with two 7900GTXs (I'll upgrade before too long). In an ordinary machine, the difference between one card and two isn't worth the money. However, on a really big screen, having 1GB of memory instead of 512MB means something, and HL2: Episode 2 will run at full res on this machine, no worries. It sure as hell wouldn't with just the one card!
I would love to buy an HD 4870 (or, as is far more likely, two), but this motherboard is all nVidia chipsets, so I'd need a new system core first.