Original URL: http://www.theregister.co.uk/2006/11/08/review_nvidia_geforce_8800_gtx/

Nvidia GeForce 8800 GTX graphics card

GeForce 8800 GTX = 2 x Radeon X1950 XTX

By Leo Waldock

Posted in Hardware, 8th November 2006 19:02 GMT

Review Are you ready for DirectX 10 gaming under Windows Vista? No? Didn't think so. Today Nvidia finally unveiled its new DX10-capable graphics chip, the GeForce 8800 - aka 'G80' - but is it worth forking out for one now, or should you wait until Windows Vista ships? Read on to find out how good the GeForce 8800 GTX really is...

[Image: Sparkle GeForce 8800 GTX]

The first batch of GeForce 8800 GTX graphics cards all come out of the same Taiwanese factory and use reference clock speeds, so there's very little to distinguish one 8800 GTX board from another. Sparkle, which kindly supplied Reg Hardware with our 8800 GTX review sample, has added some neat branding to the cooler, while the packaging is positively understated. In addition to the graphics card you get Call of Duty 2, Cyberlink PowerDVD 6 (stereo version), an s-video extension cable, a breakout cable that offers s-video output and component-video connections, two DVI-to-VGA adaptors, and not one but two six-pin power cables.

Only you won't notice any of those things as you will be immediately and utterly captivated by the enormous graphics card.

It's a double-slot design that measures 26.5cm from the inside of the bracket to the end of the PCB. To put that in context, the sizeable AMD ATI Radeon X1950 XTX measures 22.8cm in length.

The GeForce 8800 GTX chip is fabricated by TSMC on a 90nm process and packs 681 million transistors, a huge increase over the 278 million of the GeForce 7900's G71. The 384-bit memory controller connects to 768MB of GDDR3 memory that runs at 900MHz to give an effective speed of 1,800MHz and a memory bandwidth of 86.4GBps, but after that things start to get a bit complicated.
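That bandwidth figure is easy enough to check. Here's the back-of-the-envelope sum as a quick Python sketch, using only the bus width and effective clock quoted above:

    # 384-bit bus at an effective 1,800MHz (900MHz DDR, two transfers per clock)
    bus_width_bits = 384
    effective_clock_hz = 1_800_000_000

    bytes_per_transfer = bus_width_bits / 8                   # 48 bytes
    bandwidth_gbps = bytes_per_transfer * effective_clock_hz / 1e9

    print(f"{bandwidth_gbps:.1f}GBps")                        # 86.4GBps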

GeForce 8800 GTX supports DirectX 10 Shader Model 4.0 graphics and, as we predicted, Nvidia's Quantum Effects technology.

Nvidia has ditched discrete pixel shaders and vertex shaders and moved instead to a unified shader architecture, whose units it refers to as Stream Processors. DirectX 10 introduces Shader Model 4.0, which vastly increases the number of registers and textures that shaders must be able to handle, and it also introduces geometry shaders. If you extend the architecture of the GeForce 7900 and Radeon X1950 to include dedicated pixel, vertex and geometry shaders, then you can guarantee that many chunks of the GPU will be lying idle at any given moment. With unified shaders - sorry, Stream Processors - it's possible to make more of the GPU work for its living. In time, DirectX 10 will also encompass physics processing as part of the workload for unified shaders, but for the time being Nvidia has enabled this feature with its Quantum Effects physics engine.
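To see why unification helps, consider a toy model - purely illustrative, with invented unit counts and workloads, and nothing like the real scheduler:

    # Toy model: fixed shader partitions vs a unified pool (numbers invented)
    def fixed_time(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
        # Each partition can only chew through its own kind of work, so the
        # slower side sets the pace while the other side sits idle
        return max(vertex_work / vertex_units, pixel_work / pixel_units)

    def unified_time(vertex_work, pixel_work, units=32):
        # A unified pool puts every unit on whatever work is queued
        return (vertex_work + pixel_work) / units

    # A geometry-heavy scene leaves the pixel units twiddling their thumbs
    # in the fixed design, but not in the unified one
    print(fixed_time(800, 200))     # 100.0 time units
    print(unified_time(800, 200))   # 31.25 time units

The same pool can also soak up the geometry shaders DirectX 10 adds without dedicating yet more silicon to a third unit type, which is precisely Nvidia's argument.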

[Image: Sparkle GeForce 8800 GTX retail box]

The core of the die - the dispatch and texture units and the 24 ROPs - runs at 575MHz, which is the speed that Coolbits reports. However, the 128 Stream Processors run at a lofty 1,350MHz.

I mentioned that you get two PCI Express power cables in the box, and this is where the downside of that huge transistor count bites back: the Sparkle and other 8800 GTX cards demand 140W of power. Nvidia recommends that you run your GeForce 8800 GTX with a 450W or greater PSU, which sounds incredibly optimistic to us.
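A rough power budget shows where our scepticism comes from. In this sketch only the 140W board figure is quoted by Nvidia; the CPU number is the QX6700's rated TDP and the rest is our own guess:

    # Back-of-the-envelope system power budget (estimates, not measurements)
    gpu_w = 140     # quoted 8800 GTX board power
    cpu_w = 130     # Core 2 Extreme QX6700 rated TDP
    rest_w = 80     # assumed: motherboard, memory, drives and fans

    total_w = gpu_w + cpu_w + rest_w
    print(f"{total_w}W peak leaves {450 - total_w}W of headroom on a 450W PSU")

Some 350W against a 450W label looks fine on paper, but cheaper PSUs rarely deliver their rated figure on the 12V rails, which is where most of the card's 140W is drawn.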

In addition to the two power connectors you also get two SLI connectors, but we have yet to see these in action - only one review card arrived, alas. We plan to bring you a follow-up on SLI performance in the very near future.

The two DVI outputs are dual-link and support a pair of 2,048 x 1,536 displays at 85Hz, or a TFT at up to 2,560 x 1,600, while the seven-pin HDTV mini-DIN connector has been revised from previous versions.
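Dual-link matters at that size. A single DVI link tops out at a 165MHz pixel clock, and a quick sum shows why 2,560 x 1,600 needs the second link - the ~12 per cent blanking overhead here is our assumption, roughly what reduced-blanking timings require:

    # Required pixel clock for 2,560 x 1,600 at 60Hz (blanking overhead assumed)
    h, v, refresh = 2560, 1600, 60
    blanking_overhead = 1.12    # assumed ~12% for reduced-blanking timings

    pixel_clock_mhz = h * v * refresh * blanking_overhead / 1e6
    print(f"{pixel_clock_mhz:.0f}MHz")   # ~275MHz: beyond one link, fine for two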

To quote the Nvidia reviewer's guide, "the sweet-spot for Extreme High Definition resolution for a GeForce 8800 GTX configuration is 2,560 x 1,600 on a 30in LCD panel", which is rather like saying that Scarlett Johansson would make a good dinner date. Of course she would, but there are one or two practical obstacles to the proposition.

Then there's the question of a test platform. Naturally, we wanted to use the Intel Core 2 Extreme QX6700 processor, but that limited us to an Intel chipset, and we're forever hearing from Nvidia that Intel can't do PCI Express to save its life. Turning to the reviewer's guide once again, we are told to "please use an nForce 680i motherboard" - a product that has yet to be released - so it's a shame the hardware wasn't available for this review.

[Image: GeForce 8800 GTX 3DMark06 results]

In the end I opted to run the Core 2 Extreme QX6700 on an Intel D975XBX2 motherboard with 2GB of Corsair XMS2 memory and a Western Digital 150GB Raptor hard drive running Windows XP SP2, which means I'll have to build an Nvidia test rig to give GeForce 8800 GTX SLI a run in a week or two.

Anyway, a far bigger consideration is that we can only give this DirectX 10 graphics card a severe work-out under DirectX 9 and Windows XP, as we don't yet have any DirectX 10 software to run on the release candidate of Windows Vista. Not only are we testing a 2007 graphics card with 2006 software, but there is the huge unknown of the impact of DirectX 10, which is supposed to shift a large part of the general graphics workload from your CPU and dump it on the GPU.

Once the 8800 GTX was running, our first impression was that the enormous cooler works superbly well and is incredibly quiet. The fan is 72mm in diameter and stands some 20mm tall, so there's plenty of blade area. Throughout our testing the temperature didn't budge above 65°C and the noise level was utterly negligible.

Our second observation was that the drivers - version 96.89 in the box - support both High Dynamic Range (HDR) rendering and anti-aliasing (AA) at the same time on the 8800 GTX, and the AA settings go all the way up to 16xQ, thanks to the new Lumenex engine. Clearly, Nvidia expects great things from the GeForce 8800 GTX, so it was time to see it in action.

Running 3DMark06 at default settings we got 11,606 marks, which only dropped to 9,080 marks when we enabled 4x AA and 16x anisotropic filtering (AF). Overclocking the quad-core Core 2 Extreme to 2.92GHz made no difference to the 3DMark06 score. More surprisingly, when we overclocked the GeForce 8800 GTX to 626MHz/1,900MHz the scores barely changed.
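That's a modest cost for a hefty image-quality boost - the sum, worked in Python from the scores above:

    # Performance cost of enabling 4x AA and 16x AF in 3DMark06
    default_score, quality_score = 11606, 9080
    drop_pc = 100 * (default_score - quality_score) / default_score
    print(f"{drop_pc:.1f}% slower")   # 21.8% - cheap for the quality gained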

With the Core 2 Extreme running at 2.92GHz and a pair of Sapphire Radeon X1950 XTX cards in CrossFire at 648MHz/1,000MHz, we got a 3DMark06 score of 11,954 marks, which is essentially the same as the GeForce 8800 GTX achieved at the stock CPU speed.

It was time to play some games. We started by running Far Cry, but these days it's not much of a test for high-end hardware, so we switched to The Elder Scrolls IV: Oblivion, which seems to favour ATI graphics.

[Image: GeForce 8800 GTX F.E.A.R. results]

We had FRAPS logging while we ran across a hillside in sunlight at Ultra High Quality with HDR enabled at a resolution of 1,600 x 1,200. As we ran, we looked up at the sun a couple of times and then dived into a lake. It's a horrible test that would make most gaming rigs cry for mercy, but the Core 2 Extreme and 8800 GTX set-up handled it incredibly well.

With the Core 2 Extreme running at 2.92GHz, a single Sapphire Radeon X1950 XTX got an average frame rate of 52.7fps. Adding a second X1950 XTX in CrossFire mode pushed the average to 63.9fps, and we have to admit that the ATI hardware impressed us. Switching to the Sparkle 8800 GTX, the average frame rate climbed to a stunning 73.7fps, so we cranked the resolution to the maximum setting of 1,920 x 1,440 and saw the speed drop by a tiny amount, to 73.3fps - a difference of just 0.4fps.
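Putting those Oblivion averages side by side makes the gap plain - the percentages, worked in Python from the figures above:

    # Relative performance from the Oblivion averages quoted in the text
    single_xtx, crossfire_pair, gtx = 52.7, 63.9, 73.7

    def gain(new, old):
        return f"+{100 * (new / old - 1):.0f}%"

    print("CrossFire over one X1950 XTX:    ", gain(crossfire_pair, single_xtx))  # +21%
    print("8800 GTX over one X1950 XTX:     ", gain(gtx, single_xtx))             # +40%
    print("8800 GTX over the CrossFire pair:", gain(gtx, crossfire_pair))         # +15%

One GPU beating a pair of X1950 XTXs is exactly what the strapline at the top of this review is getting at.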

No doubt Nvidia will unlock even more performance over the coming months with driver revisions, but no matter how you look at it the GeForce 8800 GTX is a stunning piece of hardware. The only shame is that so few people will be able to afford it at a whopping £475.

Verdict

We know that only a handful of enthusiasts will spend £25 shy of £500 on a graphics card these days but that shouldn't detract from the power of Nvidia's GeForce 8800 GTX. It's deeply impressive and we can't wait for this new technology to trickle down to a more mass-market price point. ®