Feeds

FutureMark: Nvidia didn't cheat

Benchmark maker pulls back from the brink

It's official: Nvidia didn't fix its 3DMark 03 figures after all. So says no less a source than 3DMark 03 developer FutureMark, in a press release jointly issued with Nvidia this morning.

It appears the two companies had "detailed discussions" following FutureMark's announcement last month that its investigation into Nvidia's driver software had revealed that the graphics chip company had tailored its code to generate higher frame-rates in 3DMark 03 tests at the expense of image quality.

FutureMark discovered eight instances of cheating, which improved the performance of Nvidia's Detonator FX and WHQL drivers by as much as 24.1 per cent.

The investigation followed tests carried out by ExtremeTech which highlighted the degradation of the visuals observed when running 3DMark 03 on Nvidia hardware.

FutureMark said: "The cheating described [in its Nvidia audit] is totally different from optimisation. Optimising the driver code to increase efficiency is a technique often used to enhance game performance and carries greater legitimacy, since the rendered image is exactly what the developer intended."

Now, though, it seems FutureMark was completely wrong. "FutureMark now has a deeper understanding of the situation and Nvidia's optimisation strategy," it says. "In the light of this, FutureMark now states that Nvidia's driver design is an application specific optimisation and not a cheat."

FutureMark says it now appreciates that different graphics chips have different architectures and their own, unique optimal code paths, and that just as it takes these into account with its CPU benchmarks, so it needs to do so with graphics tests too. In this case, that largely means working with multiple, different shader precisions. ATI's top-end chip works at 24-bit floating point, but Nvidia's can switch between 32-bit and 16-bit floating point, and 12-bit integer. As Doom creator John Carmack recently wrote, "there isn't actually a mode where [the two architectures] can be exactly compared."
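The precision gap described above can be made concrete with a short sketch. This is an illustration only, not anything from FutureMark's code: Python's `struct` module can pack a value into IEEE half-precision (the `'e'` format), which mimics the rounding a shader value suffers when run at 16-bit rather than full float precision.

```python
import struct

def quantize_fp16(x):
    # Round a Python float (64-bit) to IEEE half precision and back,
    # mimicking a shader computation carried out at 16-bit float precision.
    return struct.unpack('e', struct.pack('e', x))[0]

full = 1.0 / 3.0               # value held at full precision
half = quantize_fp16(full)     # the same value after a 16-bit pass
print(abs(full - half))        # the error introduced by the narrower format
```

Errors of this size accumulate across the many operations in a pixel shader, which is why dropping from 32-bit or 24-bit to 16-bit precision can raise frame-rates while visibly degrading the rendered image.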

In short, it intends to consider building tests that take full advantage of whichever graphics chip they're running on. That way, neither Nvidia nor ATI will need to substitute its own shader code for the benchmark's, via driver software, in order to suit the test to its architecture.
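The approach FutureMark is weighing up might look something like the following minimal sketch. Everything here is hypothetical: the function, the shader file names and the lookup table are invented for illustration; only the vendor precisions come from the article.

```python
# Hypothetical sketch of a benchmark choosing a per-vendor shader path
# at load time, rather than letting drivers swap shaders behind its back.
SHADER_PATHS = {
    "ATI":    {"precision": "fp24",      "source": "shader_fp24.ps"},
    "Nvidia": {"precision": "fp32/fp16", "source": "shader_mixed.ps"},
}

def select_shader(vendor):
    # Unknown chips fall back to a generic full-precision path,
    # i.e. FutureMark's original one-test-fits-all approach.
    generic = {"precision": "fp32", "source": "shader_generic.ps"}
    return SHADER_PATHS.get(vendor, generic)

print(select_shader("ATI")["precision"])
```

The maintenance burden the article goes on to describe falls directly out of this design: every new driver-level optimisation from either vendor means another entry, or another shader file, in that table.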

That's counter to FutureMark's original approach of producing a generic, un-optimised test and seeing how well each processor handles it.

The new approach should yield better, more 'real world' results. If game developers can implement different rendering code paths within their applications to take advantage of different graphics chips, why can't benchmark coders?

The downside is that every time ATI or Nvidia spots a new optimisation and implements it in a new driver release, FutureMark is going to have to update its software: the benchmarks will become a constantly moving target, making chip-to-chip comparisons less valuable.

And it will leave FutureMark open to future claims of bias toward one vendor or another. If FutureMark implements ATI's latest architecture-specific optimisations but Nvidia's didn't arrive in time to make the cut, Nvidia is going to yell 'foul' if its products don't score so well. The situation will resemble the ongoing accusations of bias levelled at processor benchmark makers by AMD and Intel.

FutureMark hasn't committed itself to this alternative approach. As it says: "FutureMark will consider whether this approach... where manufacturer-specific code path optimisation is directly in the code source... is needed in its future benchmarks." (our italics)

That, however, may be a face-saving move granted to a company that has already tacitly admitted it can't afford to antagonise an operation like Nvidia (see 3DMark producer Patric Ojala's comments on Beyond3D). FutureMark may have to adopt such a methodology if it wants to avoid conflict in future.

Nvidia does seem to have moved away from its accusation that 3DMark 03 was deliberately biased against its products. So both companies' lawyers seem to have won concessions from the other's. Whatever, Nvidia may now re-join FutureMark's beta programme, which it quit earlier this year.

In the meantime, the statement will be unlikely to impress users who've seen screenshots showing the effects of Nvidia's 'optimised' shaders. ®

Related Stories

What's wrong with this pixel shader?
ATI admits it 'optimised' drivers for 3DMark 03

Related Link

ExtremeTech's coverage of benchmark cheating