We have to ask ourselves why Nvidia launched the Ultra when it already ruled the roost with the 8800 GTX. The obvious answer is that it had to prepare for the launch of the Radeon HD 2900 XT. However, the performance of the HD 2900 XT fell a long way short of the 8800 GTX, so there was arguably no need for the Ultra after all.
[Benchmark chart: The Elder Scrolls IV: Oblivion results, frames per second (FPS) - longer bars are better]
Indeed, there's some evidence that the Ultra was a bit rushed, as driver version 158.22 isn't WHQL-approved and Nvidia's nTune v5.05 overclocking utility doesn't recognise the Ultra.
No doubt Nvidia enjoyed the 'Fastest Ever' headlines that rained on AMD's parade, but it must also be earning a small fortune on every Ultra chip it sells, and we suspect that was the real motivation.
Asus' EN8800 Ultra: big box, small bundle
The Asus part of the package is minimal and consists of a copy of S.T.A.L.K.E.R.: Shadow of Chernobyl, which is nice but retails for less than £20. You also get an S-Video-to-component video splitter cable, one DVI-to-VGA adapter and two PCIe power adapters. Please, if you're spending this much cash on an Ultra, go and get yourself a proper power supply instead of relying on adapters.
The 8800 Ultra takes over from the 8800 GTX as the fastest graphics card on the market but the price is truly terrifying. No doubt there will be a handful of gamers who are unable to resist squeezing an extra few frames per second onto their huge LCD TV at any price, and we applaud them wholeheartedly. No, that's quite insincere - in truth we're as jealous as heck...
Asus EN8800 Nvidia GeForce 8800 Ultra-based graphics card
you know what the sad thing is?
Most people will buy it, just because the games on the market consider their otherwise fine card "obsolete" and disable a bunch of features that the card could otherwise handle just fine. That, plus forced driver obsolescence. For example, my current Linux box runs off a 2000-ish GeForce4 MX440 AGP. Nvidia recently stopped supporting drivers for the board, but since the legacy driver runs fine on it, no problem. But then, what will happen when kernel 2.8 comes out a few years down the road, its API changes, and the legacy driver stops compiling?
I used to follow the "GPU Wars" religiously, ever since the Voodoo II hit the shelves, and was then challenged by the Riva TNT 128 (it's been so long, I'm not even sure I have the names right...)
But I lost interest after a while.
Firstly, the prices kept going up, and the release cycles shortened, to the point where the whole upgrade treadmill became ridiculously expensive.
Secondly, the competition between ATI and Nvidia rapidly descended into self-parody.
And finally, the naming schemes got so complex I needed a look-up chart to know which chipset was faster than which, and by how much.
Is a GT faster or slower than a GTS? And a GTX? And what about the overlap with the previous generation? Is a 7600 faster or slower than a 6800? How does a 7800 GTSX5 Ultra Mega mk II rev B compare with the latest 8xxx?
Where's my lookup table? Oops, it's out of date. They've released twelve new cards since it was printed, and they've also changed all the suffixes between generations. Now a GS means what MX meant last year, except if we're talking DX9 performance, in which case the MX is actually *better* than the GS, despite being older. Or something...
In the end, it sounds more and more like Scott Adams' idea of a "confusopoly" - "a group of companies with similar products who intentionally confuse customers instead of competing on price". See http://en.wikipedia.org/wiki/Confusopoly (or http://en.wikipedia.org/wiki/The_Dilbert_Future)
Nice try, NVIDIA, but my 8800 GTX is clocked higher and cost a lot less.
NiBiTor and NVflash FTW!