Nvidia analysis ‘shockingly ignorant’

Shader business

Letter Re: Why Nvidia cg won't work

I must call foul on the rather hasty and reactionary article you posted in response to the Nvidia Cg shader launch. Andrew Richards predicts that Cg will fail because a C++ compiler for a PS2 chip is better. This is a very strange claim, considering that the PS2 processor in question supports, at best, geometry transformation with access to host memory, on a platform that is essentially irrelevant to most graphics systems.

Would you predict the failure of Direct3D because it isn't on PS2? Or of OpenGL because it merely has poor third-party support on PS2? Fundamentally, the PS2 is a strange beast with quite primitive shading capabilities.

It's not so difficult to understand Richards' response once it becomes clear that he draws no distinction between pixel ('fragment') shading and vertex shading, nor does he seem to understand the requirements for shading efficiently on a highly scalable stream processor, as opposed to writing War and Peace on a more general-purpose processor.
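To make that distinction concrete, here is a minimal Cg-style vertex program (a sketch only; the structure, entry point and parameter names are illustrative, not taken from any shipping shader). It consumes one vertex from the stream, transforms it and passes a colour along; everything is float-based, with no pointers and no arbitrary fetches from system memory:

```
// Minimal Cg vertex program (illustrative names: appin, vertout, modelViewProj)
struct appin {
    float4 position : POSITION;   // per-vertex data streamed in by the host
    float4 color    : COLOR0;
};

struct vertout {
    float4 position : POSITION;   // clip-space position handed to the rasteriser
    float4 color    : COLOR0;
};

vertout main(appin IN, uniform float4x4 modelViewProj)
{
    vertout OUT;
    // One transform per vertex: pure streaming work, nothing general-purpose.
    OUT.position = mul(modelViewProj, IN.position);
    OUT.color    = IN.color;
    return OUT;
}
```

A fragment (pixel) program has much the same shape but runs once per rasterised pixel, which is precisely why the two stages have different capabilities and instruction budgets on real hardware.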

The author claimed that divergent hardware will somehow defeat this shader approach. In fact, divergent hardware is one reason shader compilers exist in the first place: differing hardware (and interfaces) make it increasingly difficult to design and implement sophisticated hand-coded shaders that look visually consistent across different platforms and graphics cards.

The kind of approach being advocated by the author would clearly eliminate almost all graphics hardware as potential candidates, with his ridiculous assertion that pointer support is essential. (Some languages manage just fine without pointers.)

This is just shockingly ignorant when it comes to hardware implementations and the nature of the problem, where memory access means host-driven data streamed from memory.

(The graphics implementation is fundamentally a scalable, streaming one. There is no general-purpose capability to run arbitrary code of unlimited complexity and make any old fetch from system memory. A distinction has to be drawn between the needs of generating and supplying application data, and of rendering the data streamed to a graphics processor.)

In addition to this, research papers have demonstrated how dependent texture fetches can be used in fragment (pixel) shaders to address memory held as texture data, if one insists on this type of access pattern.
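To illustrate that access pattern, here is a small fragment-program sketch in the same Cg style (sampler and parameter names are made up for the example): the result of one texture read supplies the coordinates for a second, dependent read, effectively an indexed lookup into a table held in texture memory.

```
// Dependent texture fetch: the first lookup yields coordinates for the second.
// (Illustrative names; assumes a fragment profile that supports dependent reads.)
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D indexMap,    // "addresses" packed as colour values
            uniform sampler2D dataTable) : COLOR
{
    // First read: fetch a pair of coordinates from the index texture.
    float2 lookup = tex2D(indexMap, uv).xy;

    // Second, dependent read: use those coordinates to address the data texture.
    return tex2D(dataTable, lookup);
}
```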

It is important to understand that on CURRENT hardware - as in all PC graphics cards, and even next-generation PC cards - Cg can use the hardware's pixel operations as a low-level instruction set. The key here is that the instruction set is described by a separate profile (template) supplied with the Nvidia Cg software, which defines how Cg compiles the shader for a given target. Schemes like pointers and languages of arbitrary complexity would be woefully unsupportable.

On the issue of floats vs ints: I can use floats instead of ints in many cases, and there are limited examples of this currently in OpenGL. Many things with internal integer representations are represented by floats, which in fact abstracts the hardware to a degree. Things like integer pixel values aren't explicitly exposed, if indeed you even have integer representations. I might lose my maximum int value, but we're not writing War & Peace in a vertex program. The integer keyword has a placeholder in the spec; it's simply unsupported at present, probably because it's just not required yet, although maybe some day it will be. By the same token, loops and branches are enabled by the target hardware profile, indicating a clear intent to support these features when they become practical on interesting hardware.
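As a small illustration of floats doing 'integer' work (a hypothetical helper, not from any spec or shipping shader), floor() and fmod() stand in comfortably for integer division and modulus in a shader:

```
// Checkerboard cell selector computed entirely with floats.
// Floats represent whole numbers exactly up to 2^24, far more range than
// this kind of shader arithmetic ever needs.
float checker(float2 uv)
{
    float cell = floor(uv.x) + floor(uv.y);  // which cell are we in?
    return fmod(cell, 2.0);                  // 0.0 or 1.0, alternating cells
}
```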

The future of Cg is not assured; there are alternatives emerging on relevant platforms, and I don't number PS2 among the relevant platforms in this debate. Cg will most certainly not fail because of an obscure C++ compiler for a PS2 chip.

Finally, the assertion was that Cg will fail. Fail at what? World domination? Very probably! Fail to gain significant developer mindshare? Possibly, but the field is wide open, and although Nvidia is the 800 lb gorilla of graphics, it also has the most interesting and innovative hardware currently on the way.

So it must have a reasonable shot at this one. Half the battle is having the courage of your convictions, and failing to deliver interesting capabilities on Nvidia's graphics platforms would be worse than doing nothing. With Microsoft's endorsement and serious tool vendors like Alias|Wavefront and Discreet getting involved, Cg can only help Nvidia.

There are many markets Nvidia might hope to chase with this technology: not just games, but also developers for whom Cg's functionality has long been the holy grail. There is an entire industry out there that has managed for years with shader functionality that can easily be implemented within the Cg framework. Now that Nvidia is on the verge of delivering this capability in hardware, I doubt those who have been begging for it will hesitate to use it for the reasons suggested. And in terms of hardware quality, Nvidia is shaping up to have the best of breed. We shall see.

Sincerely,
Corky Closure
