Nvidia GeForce 7800 GTX
More evolution than revolution?
Before I discuss the hardware used, a little on the latest ForceWare driver. The 64-bit version of the driver is now on a par with the 32-bit build, performance of TurboCache boards is up and SLi's profile system has had some time spent on it. More applications and games are now supported by the driver, for both GeForce and Quadro parts, and the user now has more control over what SLi rendering mode is used (when a choice is actually available). For video users there's better support for HDTVs and HDTV output resolutions (1080i in particular, apparently) and Windows Media Center gets new driver-supported extensions for controlling your hardware.
DFI's simply superb LanParty UT nF4 SLI-D was a willing host for everything, Corsair's flashy modules mastered the memory bus and kept everything fed with data, and I kept the CPU cool with an Akasa Evo33.
Comparing the 7800 GTX to the 6800 Ultra means that, yes, I omitted a comparison to ATI's latest and greatest. The X850 XT PE is a wee bit faster than a single 6800 Ultra overall, so use your imagination to gauge where its performance would sit relative to the boards on test.
So, new GPU powered by fast CPU. There wasn't a single lockup, performance glitch or otherwise serious issue during testing.
Nvidia's transparency anti-aliasing uses sub-pixel samples to anti-alias alpha-textures. Nvidia supports it in G70 using both multi-sampling (depth samples) and super-sampling (texture samples) using alpha information attached to the texture. Here's how it looks. Firstly, without transparency AA on an alpha-textured chain-link fence (everyone will use chain-link fences, so be warned!). All images are clickable for lossless PNG versions.
Notice the significant texture aliasing visible on the links. Turn on super-sampling transparency AA and most of the aliasing artefacts disappear.
The difference output of the two images shows you what parts of the image the super-sampling is working on. You'll need to click the image to see it properly, since resizing the image loses the difference detail.
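The basic idea behind super-sampling transparency AA can be sketched in a few lines. This is purely illustrative, not Nvidia's implementation: a plain alpha test gives each pixel a hard opaque-or-transparent result, which is why chain-link fences alias so badly, while averaging the test over several sub-pixel texture samples yields fractional coverage that smooths the edge.

```python
# Illustrative sketch only: why extra texture samples per pixel soften
# alpha-tested edges. Threshold and sample values are made up.

def alpha_test(alpha, threshold=0.5):
    """A single alpha-test sample: the pixel is fully opaque or fully
    transparent, producing the hard, aliased fence edges."""
    return 1.0 if alpha >= threshold else 0.0

def supersampled_coverage(alphas, threshold=0.5):
    """Average the alpha test over several sub-pixel texture samples;
    the fractional coverage is what anti-aliases the edge."""
    return sum(alpha_test(a, threshold) for a in alphas) / len(alphas)

# One sample per pixel: a hard 0-or-1 result on the fence texture.
print(alpha_test(0.6))                               # 1.0
# Four sub-pixel samples straddling the edge: fractional coverage.
print(supersampled_coverage([0.6, 0.4, 0.7, 0.3]))   # 0.5
```

The multi-sampling variant works similarly but reuses the existing depth samples, which is why it's cheaper and also why it cleans up less of the texture aliasing than the super-sampling mode shown in the screenshots.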
To measure the performance hit, I recorded a short demo in Half-Life 2 inside the prison section of Nova Prospekt, where there's more chain-link fencing (and hence alpha textures) than you know what to do with. Benchmarking the demo showed the performance hit.
The performance difference is about 12 per cent over the demo, within Nvidia's claimed average hit of 10-15 per cent.
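For clarity, the per-cent hit quoted above is just the drop in average framerate relative to the baseline run with transparency AA off. The framerates below are hypothetical, purely to show the arithmetic:

```python
def percent_hit(fps_off, fps_on):
    """Performance hit from enabling a feature, as a percentage of the
    baseline (feature-off) average framerate."""
    return 100.0 * (fps_off - fps_on) / fps_off

# Hypothetical demo averages (illustrative only, not measured figures).
print(round(percent_hit(100.0, 88.0), 1))  # 12.0
```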
Following ATI's announcement at Computex that unannounced hardware was going to accelerate the decode of H.264 video, a format more commonly known as MPEG-4 AVC, Nvidia has been keen to say it will have support for H.264 sometime in 2005, on all of its hardware that has working PureVideo silicon. Playing back 1080p content on the FX and 7800 GTX test platform shows around 45-55 per cent CPU usage. The GPU's doing something, but nothing a 6600 GT can't do, for example.
It appears that Nvidia hasn't spent much, if any, of G70's transistor budget on silicon used solely to process video. H.264 support seems to be something that will be accelerated by fragment programs on G70 and other Nvidia hardware, rather than by the dedicated decode hardware ATI appears to possess. With 3D speed ever increasing, and even massively powerful CPUs like the 2800MHz FX becoming a limitation for a single new board, never mind SLi, image quality and video processing improvements are what's needed next.
Video quality appears unchanged compared to NV43 and the other Nvidia GPUs with a fixed video processor, which is slightly disappointing.
With H.264 about to become the dominant video format in common use, especially since it's the native format for Sony's PSP hand-held gaming console and a video format for both HD-DVD and Blu-ray Disc, spending some of the GPU's transistor budget on decode outside of the fragment hardware seems like a prudent thing to do. We'll see.
Next page: Benchmarks