Why Nvidia's Cg won't work

Opinion Andrew Richards, founder of Codeplay, the London-based designer of cutting-edge computer and console games development tools, responds to yesterday's Nvidia Cg - 'C for graphics' - launch.

Nvidia produces capable graphics hardware which, since the introduction of the GeForce 3, contains a degree of programmability. It is programmable at the powerful but low-level pixel and vertex shader level, these shaders being hardware-specific features of its products.

Nvidia now advocates a simplified, cut-down language, known as Cg, which allows programming of those low-level pixel and vertex shaders. It is promoting this as an open standard, suitable for programming all 'GPUs', or graphics processing units.

However, not all GPUs are created equal, and the differences will only become greater. With the PlayStation 2, SCEI demonstrated a graphics pipeline programmable from higher up in the rendering process, which makes different demands of a graphics programming language.

In the future, graphics hardware will incorporate both low-level pixel and vertex shaders, as demonstrated by Nvidia, and higher-level general programmability, as demonstrated in the PlayStation 2. The Cg language is not sufficiently well specified for such hardware, particularly with reference to:

  • No break, continue, goto, switch, case, default. These are useful features that can be used without penalty on other vector processors.
  • No pointers. This is Cg's most serious omission. Pointers are necessary for storing scene graphs, and their absence will quickly become a serious limitation for vector processors that can store and process the entire scene, or sections of it.
  • No integers. This may be appropriate to Nvidia, but it is not a universal design decision.
  • Arrays use float indices. This is an odd design decision, relevant to DirectX 8 and Nvidia only.
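To illustrate the first two points, here is a minimal sketch in standard C of the kind of scene-graph code the article has in mind (the structure and field names are hypothetical); it relies on pointers, integers, and break, none of which Cg supports:

```c
#include <stddef.h>

/* Hypothetical scene-graph node: pointer-linked and integer-indexed,
   using exactly the features absent from Cg. */
typedef struct Node {
    float transform[16];       /* 4x4 local transform */
    int visible;               /* integer flag: Cg has no integer type */
    struct Node *children[8];  /* pointers: Cg has none */
    int child_count;
} Node;

/* Count visible nodes in the subtree rooted at n, stopping early with
   break once a limit is reached - a keyword Cg reserves but omits. */
int count_visible(const Node *n, int limit)
{
    if (n == NULL || !n->visible)
        return 0;
    int total = 1;
    for (int i = 0; i < n->child_count; i++) {
        if (total >= limit)
            break;
        total += count_visible(n->children[i], limit - total);
    }
    return total;
}
```

None of this is exotic by CPU standards, which is why the article regards the omissions as limiting for future, more general GPUs.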

Nvidia has introduced Cg as a standard, fully backward- and forward-compatible. However, the existence of reserved keywords (such as break and continue, mentioned above) is a clear indication that features will be added when Nvidia hardware supports them. This is not conducive to future compatibility.

Codeplay believes that Cg is inadequate for some current GPUs, and for more future ones. Most importantly, standard rendering code needs to move onto graphics processors, and Cg is not sufficiently flexible for this type of code.

Overall, Cg is not a generic language for graphics programming; it is a language for Nvidia's current generation of graphics cards. It has a different set of goals from what is required to design the computer graphics at the heart of the successful computer games of tomorrow.

Instead, the right approach is to use standard C/C++, and to make the compiler sufficiently advanced to be able to produce high quality code for dedicated hardware. This facilitates programming of advanced hardware at both the low-level of shaders and the higher level of the rendering pipeline, and is the language familiar to millions of programmers.
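As a sketch of that approach, consider per-vertex work expressed in ordinary C (function and parameter names are illustrative, not from any Codeplay product): a sufficiently advanced compiler would be expected to vectorize this loop for the target graphics hardware, with no special-purpose language needed.

```c
#include <stddef.h>

/* Transform n 4-component vertex positions by a 4x4 row-major matrix.
   Plain, portable C: the burden of mapping this loop onto dedicated
   vector hardware falls on the compiler, not on a new language. */
void transform_vertices(float *out, const float *in,
                        const float m[16], size_t n)
{
    for (size_t v = 0; v < n; v++) {
        const float *p = in + 4 * v;
        float *q = out + 4 * v;
        for (int r = 0; r < 4; r++) {
            q[r] = m[4*r + 0] * p[0] + m[4*r + 1] * p[1]
                 + m[4*r + 2] * p[2] + m[4*r + 3] * p[3];
        }
    }
}
```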

Codeplay has demonstrated the implementation of a C/C++ compiler for a specialist graphics co-processor in its VectorC {VU} compiler for PlayStation 2. This approach will be extended to embrace emerging hardware features without the need for proprietary 'standards'.

Computer games are an intricate blend of immersive graphics and creative gameplay programming. With cross-platform standards and development tools that work as programmers demand, game developers can continue to produce radical-looking games beyond the vision of hardware engineers, thanks to creative, talented software engineering. ®
