AMD Radeon HD 4850 and 4870
Snatching the price:performance crown from Nvidia?
Review The new AMD 'RV770' graphics chip that lies at the heart of Radeon HD 4850 and 4870 owes a great deal to the 'RV670' that we saw in HD 3850 and 3870. It uses the same 55nm fabrication process and continues to support DirectX 10.1, but the transistor count has risen from 666m to 956m.
That’s a 44 per cent increase in the number of transistors, so it’s impressive that the die area has increased by only 37 per cent, from 190 to 260 square millimetres.
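For readers who want to check the figures above, a quick back-of-envelope calculation (using only the transistor counts and die areas quoted in the article) shows that transistor density has edged up slightly on the same 55nm process:

```python
# Figures as quoted in the article for RV670 and RV770
rv670_transistors, rv770_transistors = 666e6, 956e6
rv670_area, rv770_area = 190.0, 260.0  # square millimetres

# Growth in transistor count and die area
transistor_growth = rv770_transistors / rv670_transistors - 1  # ~44 per cent
area_growth = rv770_area / rv670_area - 1                      # ~37 per cent

# Transistors per square millimetre on the same 55nm process
rv670_density = rv670_transistors / rv670_area
rv770_density = rv770_transistors / rv770_area
density_gain = rv770_density / rv670_density - 1               # ~5 per cent

print(f"transistor growth: {transistor_growth:.0%}")
print(f"area growth: {area_growth:.0%}")
print(f"density gain: {density_gain:.0%}")
```

In other words, most of the extra transistors are paid for in extra die area, with only a modest five per cent or so coming from denser layout.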
By the sound of it, AMD has taken baby steps with its new chip but one part of the specification leaps off the page: the number of Stream Processors - aka unified shaders - has jumped from 320 to 800.
AMD's RV770: die hard
These numbers puzzled us, so we fired off an e-mail to AMD thus:
"How the heck has the shader count climbed from 320 in RV670 to 800 in RV770 on the same 55nm process? Do the old and the new shaders use the same number of transistors?"
We got this response from Richard Huddy, AMD's head of developer relations for graphics chips: "Yes, the RV770 is only about 40 per cent larger than RV670, but the transistors are used on average about 50 per cent more efficiently too – and when you combine both of those factors together you get a much more impressive design.
"Traditionally we’ve focused only on saving silicon area by moving to smaller processes. This time we made a very deliberate decision to get our performance per dollar much higher."
It sounds like AMD has adopted the Intel Tick-Tock approach. RV670 - Tick - was a major step forward from the 80nm Radeon HD 2900, and RV770 - Tock - has refined the design to deliver higher performance and better value.
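Huddy's two factors multiply rather than add, which is why the combined result is so striking. A minimal sketch of the arithmetic, using his quoted figures:

```python
# Huddy's figures: die ~40 per cent larger, transistors used
# ~50 per cent more efficiently on average
size_factor = 1.40
efficiency_factor = 1.50

# The gains compound multiplicatively
combined = size_factor * efficiency_factor
print(f"combined gain: {combined:.1f}x")  # roughly 2.1x
```

A 2.1x effective gain goes a long way towards explaining how the shader count could climb from 320 to 800 without the die ballooning.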
What were you thinking when you made those graphs?
They tell you nothing, NOTHING I tell you!
And another thing...
Can't understand folks' bleating about the PowerPlay thing. There are so many folk giving it: "Ooh, once again ATi screws us with the drivers, this is a joke, I'm sticking with nVidia, at least the drivers work."
1) nVidia are just as bad if not worse for feature support in early drivers - look at the 7800GX2 fiasco.
2) If you want to spend more money on a card that will do the same job and consume more power then go ahead and quit whining
3) Er, wait a few weeks?
The only way I can see the "ZOMG the drivurz are skrewd and now I have to w8 a hole month" argument making any sense is if you're a spoilt kid buying into the latest architecture every time. Even then, new architectures don't come out often enough to justify such pissy, bitchy whining.
PS: I'm not an ATi fanboy (until a few weeks ago I was poised to buy a 9800GTX) but I am a big fan of not having to filter out pissy whiny moans whilst reading comments on a mature IT news site.
Come on guys, everyone knows that Crysis is not an accurate benchmark. It has been dropped from tests on a number of occasions.
I'm waiting for anyone to do a fanless 4850 for xFire.
Wow... a PS3 fanboi, hey?
I'm not even going to go into it, since so many people already did. I'd just be beating a dead ... vulture?
There is one situation, and one situation only, where Crossfire/SLI come into their own. Big displays. Really big displays. I'm running at 3840x1024, with two 7900GTXs (I'll upgrade before too long). In an ordinary machine, the difference between one card and two isn't worth the money. However, on a really big screen, having 1GB of memory instead of 512MB means something, and HL2: Episode 2 will run at full res on this machine, no worries. It sure as hell wouldn't with just the one card!
I would love to buy a 4870HD (or, as is far more likely, two), but this motherboard is all nVidia chipsets, so I'd need a new system core first.