Original URL: https://www.theregister.com/2009/01/20/review_round_up_desktop_integrated_gpus/

Desktop integrated graphics shoot-out

The best IGPs from Intel, AMD and Nvidia slug it out

By Leo Waldock

Posted in Channel, 20th January 2009 13:02 GMT

Review There was a time, not so long ago, when integrated graphics were so feeble they couldn’t pull the metaphorical skin off a rice pudding. Broadly speaking, the integrated graphics processor (IGP) was fit for little more than the two-dimensional Windows desktop, and a graphics card was necessary if you wanted to play games.

Intel vs AMD integrated graphics

This separation between gamers and the general public worked remarkably well. The majority of PC users – the ones who don’t play games – could buy a cheap, quiet PC with a chipset from AMD, ALi/ULi, ATI, Intel, Nvidia, SiS or VIA without paying too much attention to the graphics, while gamers would buy a graphics card from ATI or Nvidia.

The chipset business went through a minor change when Nvidia bought ULi in December 2005, and a bigger change when AMD acquired ATI the following year.

Today, SiS has faded so far into the background that we weren’t even aware that it had a range of Core 2 chipsets. To the best of our knowledge, you can't buy an SiS motherboard in Western Europe. VIA has moved away from the desktop PC market to concentrate on its highly integrated Mini- and Nano-ITX motherboards.

That leaves AMD, Nvidia and Intel in the desktop chipset business, with AMD and Intel making chipsets for their own processors, and Nvidia sat in the middle catering for both makes of CPU.

Windows Vista was launched at the beginning of 2007, after a lengthy gestation period. The OS places significant demands on graphics hardware as the 3D Aero UI in the Premium and Ultimate versions requires a GPU that is compliant with DirectX's Shader Model 2.

That was a problem for Intel as its so-called Extreme Graphics couldn’t handle Aero. The chip maker was forced to raise its game, and it launched the G965 chipset, which introduced the GMA X3000 graphics core.

GPU-z: Intel and Nvidia on Intel

Intel and Nvidia on Intel

The other significant aspect of Vista is its whole-hearted support for DRM. This meant that Hollywood was prepared to allow its precious high-definition Blu-ray movies to run on Vista. The hefty amount of processing power required by Blu-ray's codecs will flog your CPU to death unless you help it out with a dedicated video decoding unit, and the logical place for this is within the graphics chip.

The dedicated video decoding core, usually branded a Universal Video Decoder (UVD), is increasingly important for movie buffs and Media Centre fans as HD movie playback may be the sternest task that the PC faces. So a decent UVD will allow you to specify a processor that's slower, cooler and cheaper.

Digital output is essential, and it's increasingly common to find motherboards with one VGA, one DVI, an HDMI and possibly a DisplayPort connector. Most graphics chips are perfectly capable of supporting two digital displays in any permutation, but there are very few motherboards that offer more than one DVI connector.

Finally, we have gaming on integrated graphics, which isn’t the contradiction in terms that you might think. During our testing, we found that it was surprisingly easy to play Far Cry on quite reasonable settings. Granted, the original Far Cry dates back to 2004, but back in the day we found you needed a GeForce 6800 to make the game run properly.

Now that’s progress.

GPU-z: AMD and Nvidia on AMD

Nvidia and AMD on AMD

At present, the only chipset that supports the Core i7 is the Intel X58 so you can't use Intel’s new processor with integrated graphics. Motherboards for the Socket 939 Athlon 64 have completely vanished. This neatly divides the current crop of IGPs and chipsets into two groups. In the blue corner, we have the Intel and Nvidia chipsets that support LGA775 Core 2, and in the green corner we have AMD and Nvidia for Socket AM2+ Phenom.

We compared three Intel motherboards using a £150 Core 2 Duo E8500 that runs at 3.16GHz, while the two AMD chipsets were run on an Athlon X2 7750 that, at £65, is considerably cheaper. Both processors are dual core and, despite the name, the Athlon X2 7750 is actually a two-core Phenom, so the memory support runs to 1066MHz DDR2. The Intel motherboards also support DDR2 rather than DDR3.
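Since an IGP has no dedicated video RAM and shares system memory with the CPU, the memory spec above puts a hard ceiling on graphics performance. A quick sketch of the peak bandwidth on offer (assuming a dual-channel DDR2-1066 configuration, which is our reading of the boards on test):

```python
# Peak memory bandwidth available to an IGP, which shares system RAM.
# DDR2-1066 moves 8 bytes per transfer per 64-bit channel;
# a dual-channel configuration doubles that.
transfers_per_sec = 1066e6   # effective data rate of "1066MHz" DDR2
bytes_per_transfer = 8       # one 64-bit channel
channels = 2                 # assumption: dual-channel board layout

gb_per_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"Peak shared-memory bandwidth: {gb_per_s:.1f} GB/s")
```

Roughly 17GB/s at best, and the CPU is competing for the same bus – one reason IGPs trail even cheap discrete cards with their own memory.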

The Athlon X2 7750 has a clock speed of 2.7GHz so we raised the speed to 2.8GHz on standard voltages to give it slightly more grunt without increasing the power draw or heat output.

3DMark06 Results

Longer bars are better

Power Draw Results

Power draw in Watts (W)

PCMark05 Results

Longer bars are better

Far Cry and Far Cry 2 Results

Average frames per second
Longer bars are better

Core 2 on Intel G45 with GMA X4500 HD graphics
Example motherboard: Intel DG45ID (£95)

As you'll see from our test results, the Core 2 Duo E8500 systems idle at 50-55W and draw 75-85W under load. The AMD systems have a higher power draw of 80-100W at idle and 130-135W under load, figures that would have dropped by around 20W had we used a 2.5GHz Athlon X2 4850e. The downside is that CPU performance would also have fallen by some 20 per cent. That may be perfectly acceptable for Media Centre aficionados, but it would have shown the AMD chipsets in a poor light as the Core 2 already had a significant lead in the performance stakes.
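The idle gap matters more than the load gap for a machine that sits switched on most of the day. A rough sketch of what it costs, assuming the PC idles eight hours a day and a hypothetical tariff of £0.12 per kWh (both figures are our assumptions, not measurements; plug in your own):

```python
# Annual running cost of the idle power gap between the two platforms.
# Power figures are the best Intel and worst AMD idle numbers from our tests;
# the usage pattern and tariff are illustrative assumptions.
intel_idle_w = 55
amd_idle_w = 100
hours_per_year = 8 * 365     # assumption: idles 8 hours a day
tariff = 0.12                # assumption: £ per kWh

delta_kwh = (amd_idle_w - intel_idle_w) * hours_per_year / 1000
print(f"Extra idle energy: {delta_kwh:.0f} kWh/year, about £{delta_kwh * tariff:.2f}")
```

Call it £15 or so a year at the extremes – not ruinous, but worth weighing against the AMD platform's £85 saving on the CPU.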

Intel DG45ID

Intel's DG45ID: overshadowed by its rivals?

Windows Vista forced Intel to raise its game with a succession of graphics cores that started with the GMA X3000 and then stepped through GMA X3100 and GMA X3500. Although these chipsets offered improvements in both features and performance, it's only with the GMA X4500 HD graphics core in the G45 chipset that Intel has finally provided an IGP that delivers the goods. The X4500 HD also supports the full range of HD video formats, thanks to what Intel calls Clear Video technology, so if you’re only looking at specifications, the G45 chipset looks like a genuine contender.

In our tests, we found that the GMA X4500 HD delivered significantly lower gaming results than the other graphics chips, no doubt due in part to Intel's use of a mere ten unified shaders running at a relatively slow 800MHz. By contrast, Nvidia runs its 16 shaders at 1500MHz, while AMD uses 40 shaders in its Radeon HD 3300 IGP core.
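Multiplying shader count by shader clock gives a crude, back-of-the-envelope measure of raw throughput. It ignores architectural differences entirely, so treat it as a ballpark only; we've left AMD out because its shader clock isn't quoted above:

```python
# Crude relative shader throughput: shaders x shader clock.
# This assumes equal work per shader per clock, which real
# architectures do not deliver - it's a ballpark, not a benchmark.
igps = {
    "Intel GMA X4500 HD":  (10, 800),    # shaders, shader clock in MHz
    "Nvidia GeForce 9400": (16, 1500),
}
for name, (shaders, clock_mhz) in igps.items():
    print(f"{name}: {shaders * clock_mhz} shader-MHz")
```

On that naive measure Nvidia has three times Intel's raw shader grunt, which is broadly consistent with the gaming gap we measured.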

In addition, we found that the quality of Blu-ray playback was affected by the BIOS release, which seems to be the result of hardware decoding being incorrectly disabled in some versions. The system performance of G45 is very good and the power draw is low, with the result that the chipset and its passive cooler barely get warm to the touch. However, the graphics are a distinct weakness. You can buy a GeForce 9300 motherboard for the same price as a G45 model, and that makes it difficult to recommend the Intel option.

Verdict

Once AMD stopped producing ATI chipsets for Core 2, we had no choice but to use Intel graphics. Nvidia has now stepped into the breach and in the process it has overshadowed Intel's G45 and GMA X4500 HD graphics.

Core 2 on Nvidia GeForce 9300, 9400 and Quadro FX 470 graphics
Example motherboards: Asus P5N7A-VM with GeForce 9300 (£89), Gigabyte GA-E7AUM-DS2H with GeForce 9400 (£117), Asus P5N-VM WS with Quadro FX 470 (£177)

Nvidia offers three graphics cores for the LGA775 Core 2 platform that share a common root. All three are fabricated on a 65nm process, and support DirectX 10.0 and Shader Model 4.0 with 16 unified shaders. The GeForce 9300 is the baby of the group, with a core speed of 450MHz and shaders that run at 1200MHz, while the top-end GeForce 9400 has clock speeds of 500MHz and 1500MHz, respectively. The other features of the two chips are identical.

Asus P5N7A-VM

Asus' P5N7A-VM: the natural choice for Core 2 CPUs?

We expected to find that the GeForce 9400 would draw more power than the 9300 but in fact the Zotac GeForce 9300 drew 5W more than the Gigabyte 9400 under load. It's quite possible that the balance was tipped by the active fan on the chipset of the Zotac board as the Gigabyte is passively cooled. The Nvidia chipset is a single chip unlike the northbridge/southbridge combo used by both Intel and AMD, so the cooling has to work fairly hard. We’re none too impressed by passive coolers on Nvidia chipsets and recommend that you use a case fan to keep the chipset below 60°C or 70°C.

The third IGP in the set is the Quadro FX 470, which is effectively identical to GeForce 9400. However, the drivers identify it as a workstation GPU. You’ll pay a hefty price for the Quadro FX 470 as the Asus P5N-VM WS motherboard sells for £177. So that’s a premium of £60 for the honour of having your graphics drivers and software recognise a Quadro rather than a GeForce.

Both GeForce 9300 and 9400 allowed us to play Far Cry at 1680 x 1050 with the image quality on High. That's impressive enough but we were also able to play Far Cry 2 moderately well with the quality settings turned to Low. Although PCMark05 shows that the graphics in GeForce 9400 have a significant advantage over GeForce 9300, we didn’t see this reflected in 3DMark06 or when playing games, so we favour the cheaper 9300 as it does the same job at a lower price.

Asus P5N7A-VM

Gets too warm for a media centre?

Performance in movie playback is very good and the GeForce 9300 trounces Intel G45 in a fight that is rather one-sided. The only reservation we have is that GeForce 9300 draws more power than we would like and as a result it gets a tad warmer than is ideal in a Media Centre.

Verdict

The GeForce 9300 gets the big thumbs up in comparison with Intel G45, making it the natural choice for anyone with a Core 2 processor.


AMD AM2+ Phenom on AMD 790GX with Radeon HD 3300 graphics
Example motherboards: MSI DKA790GX Platinum (£125), MSI KA790GX (£77)

AMD launched its superb 780G chipset early in 2008 with integrated Radeon HD 3200 graphics that employ 40 unified shaders. When the 790GX chipset was released later in the year, the graphics core was labelled Radeon HD 3300 but it looks very similar to the core of the 780G. The only obvious difference is a move from Hybrid graphics to Hybrid CrossFire X.

The 790GX chipset makes serious advances over 780G in other ways as the SB750 southbridge assists overclocking.

MSI DKA790GX

MSI's DKA790GX: impressive graphics?

The Radeon HD 3300 IGP beats the Nvidia GeForce 9400 by a narrow margin in 3DMark06 and loses by a similar amount in PCMark05. In our gaming tests we had mixed results. The HD 3300 core delivered the DirectX 9 goods in Far Cry and Far Cry 2 but lost out when we ran Far Cry 2 on DirectX 10.

The other area of concern is that the power draw was rather high, which can doubtless be blamed on the Athlon X2 processor. The Phenom family requires proper cooling if you want to avoid meltdown and the situation only gets worse if you choose an X3 or X4 model.

On the plus side, the AMD IGP’s video playback is absolutely superb, which makes the HD 3300 a natural candidate for a Media Centre PC. You'd be wise to choose a suitably low-powered processor, in which case overclocking is unlikely to be a serious consideration so you might as well choose a motherboard with the older, cheaper 780G chipset.

Verdict

The Radeon HD 3300 graphics in the 790GX chipset are impressive but they are let down by the toasty Phenom processor.

AMD AM2+ Phenom on Nvidia nForce 780a with GeForce 8400 GS graphics
Example motherboards: Foxconn Destroyer (£202), MSI K9N2 Diamond (£166)

The fourth option is to use Nvidia graphics on Phenom. This is provided by the nForce 780a chipset. The graphics core in question is the GeForce 8400 GS, which has a specification that's very similar to the GeForce 9300 but there is a fundamental difference. GeForce 9300 and 9400 motherboards have so far been Micro ATX designs with a single PCI Express graphics slot. The Foxconn Destroyer is a big, expensive motherboard that supports Tri-SLI so the integrated graphics are intended for use with Nvidia’s Hybrid SLI feature.

Foxconn Destroyer

Foxconn's Destroyer: slow, hot and expensive?

This is just as well, as the GeForce 8400 GS was unimpressive in our tests and produced similar results to the Intel G45 in all of our benchmarks. At idle, the system drew a scary 100W, 20W more than the AMD 790GX. Under load, it hit 135W, 5W higher than the AMD part.

Video playback was very good and was indistinguishable from the Radeon HD 3300 and GeForce 9400, but this would be a bizarre motherboard to choose for movies unless you were wearing headphones to block out the noise from the cooling system.

To our jaundiced eyes, the nForce 780a looks like it was developed as a way of enabling Tri-SLI on Phenom and in that respect it succeeds. However it is slow, hot and expensive.

Foxconn Destroyer

No shortage of portage

Verdict

The natural role for integrated graphics on Socket AM2+ is, surely, inside a Media Centre that's small, cheap and quiet. The Foxconn Destroyer and nForce 780a deliver exactly the opposite. ®