Tragically, the demo was posted a day after Asus had reclaimed its graphics card from our grasp following a very short loan period, so we can't yet comment on the looks or performance of the demo on the Ultra. We can report, however, that the demo looked awful on AMD's ATI Radeon HD 2900 XT, so it's probably best if we pass on AMD's thoughts on the subject:
"Lost Planet is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimise their drivers for. The developer has not made [AMD] aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity to explore how the benchmark uses our hardware and optimise in a similar fashion."
The reason for the short loan period is simple to explain. The 8800 Ultra is very expensive and samples are few and far between. At launch, Nvidia had just two cards available for the UK's entire array of technology correspondents, and we get the impression that Asus has only one EN8800 Ultra doing the rounds.
Asus' EN8800 Ultra graphics card
So what, you may wonder, do you get for an extra £100 over the price of a regular GeForce 8800 GTX? The core speed is raised from 575MHz to 612MHz, the Stream processors run at 1500MHz instead of 1350MHz, and the memory has an effective speed of 2160MHz rather than 1800MHz. In other words, the Ultra is a carefully selected GTX that runs approximately ten per cent faster than normal.
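For the curious, the per-domain uplift implied by those figures works out as follows (a quick illustrative calculation using only the clock speeds quoted above; the "ten per cent" summary is roughly the middle of the three):

```python
# Per-domain clock uplift of the 8800 Ultra over a stock 8800 GTX,
# using the MHz figures quoted in the article. Illustrative arithmetic only.
gtx = {"core": 575, "shader": 1350, "memory": 1800}
ultra = {"core": 612, "shader": 1500, "memory": 2160}

for domain in gtx:
    uplift = (ultra[domain] / gtx[domain] - 1) * 100
    print(f"{domain}: +{uplift:.1f}%")
```

That gives roughly +6.4 per cent on the core, +11.1 per cent on the Stream processors and +20 per cent on the memory.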
The other thing that sets the Ultra apart is the layout of the heatsink package. The GTS and GTX use a conventional double-slot design that transfers heat from the heatsink to the heat exchanger via an array of heatpipes. The cooling fan sits between the heat exchanger and the pair of PCI Express power connectors where it can suck air in from inside the case and then blow it through the heat exchanger. It's a logical arrangement. However, it means that the fan unit is located directly above the power hardware, and that's not an especially good location. With the Ultra the fan has been moved sideways where it's out of the way, hence the bulge in the plastic shroud which covers the entire length of the graphics card.
This simple change helps to keep the Ultra cool despite its maximum power draw of 175W. In 2D usage, the cooler is essentially silent but the real surprise is that the cooler barely gets any louder even when the Ultra is working hard. This is an impressive feat by any standards.
You know what the sad thing is?
Most people will buy it, just because the games on the market consider their otherwise fine card "obsolete" and disable a bunch of features that the card could otherwise handle just fine. That, plus forced driver obsolescence. For example, my current Linux box runs off a 2000-ish GeForce4 MX440 AGP. Nvidia has recently stopped supporting drivers for the board, but since the legacy driver runs fine on it, no probs. But then, what will happen when kernel 2.8 comes out a few years down the road, its API changes, and the legacy driver stops compiling?
I used to follow the "GPU Wars" religiously, ever since the Voodoo II hit the shelves, and was then challenged by the Riva TNT 128 (it's been so long, I'm not even sure I have the names right...)
But I lost interest after a while.
Firstly, the prices kept going up and the release cycles shortened, to the point where the whole upgrade treadmill became ridiculously expensive.
Secondly, the competition between ATI and Nvidia rapidly descended into self-parody.
And finally, the naming schemes got so complex I needed a look-up chart to know which chipset was faster than which, and by how much.
Is a GT faster or slower than a GTS? And a GTX? And what about the overlap with the previous generation? Is a 7600 faster or slower than a 6800? How does a 7800 GTSX5 Ultra Mega mk II rev B with Forceware 22.214.171.124C compare with the latest 8xxx?
Where's my lookup table? Oops, it's out of date. They've released twelve new cards since it was printed, and they've also changed all the suffixes between generations. Now a GS means what MX meant last year, except when we're talking DX9 performance, in which case the MX is actually *better* than the GS, despite being older. Or something...
In the end, it sounds more and more like Scott Adams' idea of a "confuseopoly" - "a group of companies with similar products who intentionally confuse customers instead of competing on price". See http://en.wikipedia.org/wiki/Confusopoly (or http://en.wikipedia.org/wiki/The_Dilbert_Future)
Nice try, NVIDIA, but my 8800 GTX is clocked higher, and cost a lot less.
NiBiTor and NVflash FTW!