Original URL: http://www.theregister.co.uk/2007/04/17/review_msi_nx8600gts/

MSI NX8600GTS graphics card

Surname T2D256E-HD-OC, in case you were wondering

By Leo Waldock

Posted in Hardware, 17th April 2007 13:13 GMT

Review Nvidia stole a march on ATi when it launched the DirectX 10 GeForce 8800 GTS and GTX chips back in November 2006. Even though the G80/8800 was a powerhouse of a chip we wondered who exactly would spend the thick end of £400 on a single graphics card or £800 on a pair of the things in SLI.

MSI NX8600GTS-T2D256E-HD-OC graphics card

While the performance of 8800 was little short of epic, what we really wanted to see were the mid-range versions of the 8800 that Nvidia would inevitably release in time. And now, finally, that time has arrived and Nvidia has launched the 8600GTS, 8600GT, and 8500GT with the promise of an 8300GT some time in the future.

We're looking at the MSI NX8600GTS-T2D256E-HD-OC, which is a pre-overclocked 8600GTS, though the overclock is quite modest. More to the point, we have yet to see a regular 8600GTS - and judging by the phone calls we've received over the past few days, there seem to be plenty of these faster GTS models doing the rounds.

The MSI is a double slot card with a bulky cooler that uses a heatpipe to move heat to a finned cooler that is positioned between the heatsink and the bracket. The fan blows cooling air through a duct and across the cooler, but unusually the hot air isn't expelled directly out through the bracket as there is a 25mm gap between the finned heat exchanger and the vented bracket.

Although this arrangement keeps the 8600GTS chip cool, it is rather noisy and is more reminiscent of a rackety Radeon X850 than a sophisticated GeForce 7800GTX.


You get a conventional package that consists of the graphics card, a power adapter cable, two DVI adapters, a splitter cable with Component and S-Video outputs, and an S-Video extension cable. The software package includes Forceware v101.02 for both Windows XP and Vista plus MSI Live!, Live Update 3 and a few other bits and pieces. MSI is rather pleased with its Dual Core Cell utility, which monitors the graphics card in Windows Vista. At the time of writing the utility was not included on the CD or MSI Live Update, so we downloaded it from MSI's 8800GTX web page and it did a decent enough job.

The 8600GTS chip uses 289 million transistors, which is less than half the 691 million found in a full-fat 8800. It has a reference core speed of 675MHz and 1GHz DDR memory for an effective speed of 2GHz; this MSI, however, has a core speed of 700MHz and 256MB of GDDR-3 that runs at 1050MHz for an effective speed of 2.1GHz. As we said, it's a small overclock. We can't be sure, but we reckon the stream processors run at or near 1.45GHz, and we also understand that Nvidia has moved to an 80nm fabrication process.

There are 32 stream processors or unified shaders, compared to the 128 found in 8800GTX and 96 in an 8800GTS, and the memory controller has been cleaved down to 128-bit. Although the 8600GTS core, memory and shaders all run faster than those in the 8800, if you take all those numbers together and add in the maximum power rating of 71W, it suggests that the 8600GTS will have somewhere between one quarter and one half of the performance that we saw from the 8800 models.
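To put a number on that bandwidth cut, here's a rough back-of-the-envelope sketch. The 8600GTS figures come from the review; the 8800 GTX memory clock of 900MHz (1.8GHz effective) over a 384-bit bus is our assumption for the comparison, not a figure from the text.

```python
# Peak memory bandwidth sketch: bus width in bytes x effective data rate.
# 8600GTS figures are from the review; the 8800 GTX memory spec used here
# (384-bit bus, 1.8GHz effective) is an assumption for illustration.
def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

msi_8600gts = bandwidth_gb_s(128, 2100)  # 128-bit bus, 2.1GHz effective
gtx_8800 = bandwidth_gb_s(384, 1800)     # 384-bit bus, 1.8GHz effective

print(f"MSI 8600GTS: {msi_8600gts:.1f} GB/s")   # 33.6 GB/s
print(f"8800 GTX:    {gtx_8800:.1f} GB/s")      # 86.4 GB/s
print(f"Ratio:       {msi_8600gts / gtx_8800:.2f}")  # ~0.39
```

On memory bandwidth alone, then, the overclocked 8600GTS sits at roughly 40 per cent of an 8800 GTX, which is consistent with the quarter-to-half performance estimate above.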

We tested the MSI on an Asus M2A-VM motherboard with AMD 690 chipset using an Athlon 64 X2 5000+ processor, 2GB of Kingston KHX6400 memory and a WD Raptor 150GB hard drive. For comparison we used an Asus 7950GT, which is priced at £165 inc VAT, and noting reader comments we ran both set-ups in Windows XP Pro and Windows Vista Ultimate. This meant we had to use a variety of drivers: the 8600GTS uses Forceware 101.02 in both XP and Vista, while the 7950GT runs on v93.71 in XP and v100.65 in Vista. We noted that Forceware 101.02 in XP is non-WHQL.

Flipping to the test results, we didn't run Far Cry in Vista as the testing utility that we use to run a time demo doesn't work under Vista. The game itself played just fine, and by the time we got to that stage in the proceedings it was clear that the performance of the two graphics cards was fundamentally identical. Performance in Vista was a fraction slower than in XP in 3DMark06, while F.E.A.R. and Half-Life 2: Lost Coast were both faster.

Far Cry Benchmark Results (bigger bars are better)

F.E.A.R. Benchmark Results (bigger bars are better)

Half-Life 2 Benchmark Results (bigger bars are better)

We've skirted around one enormous area of difference, which is DirectX 10. The 8600GTS supports DirectX 10, while the 7950GT is DirectX 9.0c and Shader Model 3 compliant. That should give the nod to the 8600GTS, but we haven't yet seen a single piece of DirectX 10 software, unless you count Windows Vista, so we're not going to fall into that particular trap. Instead, let's just say that the 8600GTS ought to be reasonably future-proof.

Verdict

For the time being we have little choice but to ignore DirectX 10, and that means there is currently no compelling reason to upgrade from a GeForce 6600 or 7600 to an 8600. If we found ourselves forced to buy a graphics card in the next few weeks it would be a bit daft to miss out on the opportunity of playing Halo 2, so what would we do? Hmm, it's a tricky one, but there would be a strong temptation to plump for a GeForce 8800GTS with 320MB of memory at £215.