Nvidia GeForce GTX 280
Over-priced, over-specced and over here
Review Nvidia has spent the past year waiting for AMD to give it a fight in the graphics sector. The G92 chip used in GeForce 8800 GT was little more than a die-shrink of the G80 that debuted in the original GeForce 8800 GTS and GTX.
The GeForce 9800 GTX used the same G92 chip and supported DirectX 10 with Shader Model 4.0 - just like the GeForce 8000 series - so it was hard to see why Nvidia felt the need to move from 8000 to 9000 numbering. More to the point, Nvidia decided to ignore Shader Model 4.1 and DirectX 10.1, which is part of Windows Vista SP1, so it really milked the G80/G92 architecture for all it was worth.
Zotac's GTX 280 AMP!: factory overclocked
Well, the time has come for a change: the launch of the new GT200 chip, which is used in this Zotac GeForce GTX 280 AMP! Edition as well as cheaper GeForce GTX 260 models. The GT200 is an awesome piece of silicon that packs in 240 Stream Processors - Nvidia's unified shaders - compared to the 128 in the G92.
That change has raised the transistor count from 754m to 1.4bn, and as Nvidia has stuck with a 65nm fabrication process, the GT200 die has grown to an enormous 576mm² - roughly four times the size of a 65nm Intel Core 2 die.
Internally, the GT200 is divided up into ten clusters of 24 shaders and eight clusters of four ROPs, with the core running at 602MHz, the Stream Processors at 1296MHz and the 1GB of GDDR3 memory at a true speed of 1107MHz to give an effective speed of 2214MHz.
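Those memory clocks translate directly into bandwidth. A minimal back-of-the-envelope sketch, assuming the GTX 280's published 512-bit memory interface (a figure not quoted in this review):

```python
# Rough memory bandwidth calculation for the GTX 280.
# The 512-bit bus width is an assumption taken from Nvidia's
# published GT200 specs, not from the review text above.
true_clock_hz = 1_107_000_000   # 1107MHz true memory clock
ddr_transfers = 2               # GDDR3 transfers data twice per clock (hence "2214MHz effective")
bus_width_bytes = 512 // 8      # 512-bit bus = 64 bytes per transfer

bandwidth_gb_s = true_clock_hz * ddr_transfers * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # → 141.7 GB/s
```

That works out to roughly 141.7GB/s, which matches the figure Nvidia quotes for the card.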
PC Gaming has gone mad.
Recently I put together a nice powerful Quad Q9450 system with 4GB of RAM, plus a not-too-shabby 8800 GTS - the newer type with 512MB and other changes from the 320MB models.
Now imagine my surprise when I put Crysis on it, foolishly thinking that I could put the settings on at least High (not highest), as I watched my system grind to a painful halt. You could measure the frames per second in minutes.
Before this one, I hadn't built a PC in about four years, after realising I'd got into the habit of buying the biggest, fastest systems. I stopped when it all went sour and games started demanding MORE than even the latest gfx cards could deliver.
So it seems things carried on getting sourer and sourer, to the current state of utter madness.
And now it seems even this ridiculously expensive state-of-the-art card STILL doesn't allow Crysis to run at high settings at high resolutions. How old is Crysis now? I dread to think how badly it performed when it first came out.
I'm not a console gamer generally, but something led me down the path of getting an XBox 360, then a PS3. I still cringe at some of the arcadey shallow-arsed titles available, but the new GTA4 seemed to justify my decision.
...Not that Crysis is a cerebral piece of gameplay in itself!
There is a clear 1:1 relationship when designing games for consoles, 1xConsole performance = 1xGame performance. Easy to achieve seeing as all consoles are created equal.
PC gaming has just gone insane due to the inherent anti-equilibrium (cool new word combo!). I'm surprised the whole industry hasn't crashed and burned due to this hardware-software divide.
And what is it with those graphs? The GX2 and 8800 GT (in different tests) seem to come out better all round. I'm presuming this is because the GX2 is two boards in one package, but isn't it still cheaper than the 280?
So why is the 280 better then?
Or, for the price of a GTX280 you could buy an Xbox 360
... and have some decent games to play on it as well.
> Feel free to differ in opinion but if you do beware,
> your just plain wrong and likely stupid.
But may be able to spell...
clarification about the test system
Yes the Crysis figures look weird and yes that probably says more about Crysis than it does about the GTX 280.
I tested with Windows Vista Ultimate Edition SP1 32-bit, and Crysis was a fresh installation patched to v1.21.
I'm a power loving gadget geek but even I don't care about this
because in order to use DX10 or higher I have to pollute my gaming system with Vista. Which means my 8800 still doesn't run at full capability, because Microsoft refuses to make DX10 for XP Pro. Why the hell would I want to buy an even bigger card that I can't take advantage of because of poor policymaking for the software that drives it?
Hell, I'll just strap on another 8800 in SLI, still come out with decent power usage, and save an arseload of cash, perhaps even get some performance boost in framerate, even though particle effects and such are still going to be castrated.
And I don't wanna hear *jack* about DX10.1 unless it RUNS ON XP!