Nvidia boss: Intel suit to 'transform computer industry'
And GPUs rock
Nvidia CEO Jen-Hsun Huang believes the US Federal Trade Commission's lawsuit against Intel could "completely transform the computer industry."
On Wednesday, the FTC sued the world's largest chip maker over alleged anticompetitive practices. Among other things, the consumer watchdog accused Intel of illegally attempting to smother the makers of rival graphics chips.
"These products have lessened the need for CPUs, and therefore pose a threat to Intel’s monopoly power," the complaint reads. "Intel has responded to this competitive challenge by embarking on a similar anticompetitive strategy, which aims to preserve its CPU monopoly by smothering potential competition from GPU chips such as those made by Nvidia."
After the release of the complaint, Jen-Hsun Huang addressed the suit in an internal Nvidia memo, shared with Cnet. "This is an action the industry needs and one that consumers deserve. And it's one that can completely transform the computer industry," he writes. "Intel is fully aware that great graphics have become one of the most important features for consumer PCs, the fastest-growing segment of the PC market. Even more alarming to Intel is the revolutionary parallel computing technology in our GPUs that is being adopted by software developers across the world.
"The more successful we became, the bigger threat we were to Intel's monopoly. Instead of creating competitive GPU solutions and competing on the merits of their products, Intel has resorted to unlawful acts to stop us."
What's more, Huang sees the suit as a ringing endorsement for the graphics-chip industry - Intel aside. "Today's FTC announcement highlights the industry-changing impact of the GPU and the importance of our work," he says. "Our innovation is making the PC magical and amazing again. I can now imagine the day when Intel can no longer block consumers from enjoying our creation and experience computing in a way we know is possible." ®
Batshit crazy. Continue to take anything he says with a truckload of NaCl. The only company this may affect is AMD, and even that could go both ways. Until JHH sorts out Fermi, stops this constant, pitiful, desperate practice of renaming existing cards as new ones and, frankly, shuts the fuck up and stops making an arse of himself, nV will continue to be a laughing stock.
Intel has just announced that they will be modifying the Atom to incorporate the GPU... Hmmmm... methinks I hear the muffled cries of Nvidia and ATI as they get smothered by a pillow with Intel embroidered on it.
"lessened the need for CPUs" - bollocks
When I look at the latest Steam Hardware Survey data (http://store.steampowered.com/hwsurvey), I see three things that fly in the face of these words:
1) Quad-core CPU adoption has increased by 13.3% over the last 18 months
2) Multi-core CPUs now account for over three-quarters of the survey base
3) Multi-GPU systems still account for less than 2% of the survey base
The way I read this situation is that GPUs have had next to no impact on multi-core CPU progression. I wonder why? Could it be because GPU drivers have long been less efficient on SLI configurations than on single-card configs? Could it be because a host of problems often plague SLI configurations while single-GPU boxes just game on? Could physics end up having only an incidental effect on gaming? Finally, could it simply be the prohibitive price of multi-GPU setups?
As a side note, about physics: has anyone else noticed that the most visible use of the tech is to add a plethora of additional particles to explosions? Is there anyone else who finds such stupid use of that tech as annoying as I do?
Mr. Huang, you are already practically guaranteed one sale per PC. Just because you want two more sales per PC doesn't mean you deserve them. You say "great graphics have become one of the most important features for consumer PCs", and I totally agree with that. Unfortunately for all of us, great graphics are not just a matter of buying a good graphics card. Anyone who has followed hardware benchmarks for a while knows very well that a PC is the sum of its parts. A good GPU is useless on a system with a slow CPU, feeble bus speeds, or little memory. Good graphics require a powerful GPU, a powerful (and nowadays multi-core) CPU, high data transfer rates across the board, lots of fast RAM, and hard disks that don't get caught on their coffee break every other minute.
In other words, the Steam survey says exactly the opposite of what you claim: gamers are upgrading their CPUs in order to keep up with your graphics cards' needs, not to smother them.
To wrap this comment up, I would just like to add one more thing. The action the industry needs, Mr. Huang, is for you to pull out your finger and start making serious progress in GPU technology instead of renaming last year's cards with a fancy new scheme and reselling your existing stock.
Make something new and wonderful for a change; I can guarantee you it will sell. And Intel won't be able to do anything about it.
No, I think Intel would rather control the market without acquiring one of the major GPU vendors. The paranoia is more about their culture; what was it Andy Grove said about paranoia?
The problem for Intel is maintaining x86 hegemony: once people start to consider alternative CPU/GPU architectures, it has no inherent advantage over other firms. GPUs, with their RISC-style architectures, can do massively parallel work better than Intel can ever dream of.
Transform computer industry
GPUs are very interesting technology. Nvidia GPUs with CUDA are being used in many compute-intensive scientific and financial applications that are amenable to parallel execution.
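To picture the sort of workload that means, here's a minimal sketch of my own (not from any Nvidia sample; the kernel name, array names, and toy data are all hypothetical, and it assumes a recent CUDA toolkit with unified memory): a kernel that evaluates a simple option payoff, max(S - K, 0), over a million prices, one GPU thread per element.

// Toy CUDA sketch: one thread per array element evaluates an option payoff.
// Hypothetical names throughout; assumes a recent CUDA toolkit (nvcc).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void payoff(const float *spot, float *out, float strike, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
    if (i < n)
        out[i] = fmaxf(spot[i] - strike, 0.0f);      // each element is independent
}

int main() {
    const int n = 1 << 20;                           // ~1M elements
    float *spot, *out;
    cudaMallocManaged(&spot, n * sizeof(float));     // unified memory, for brevity
    cudaMallocManaged(&out,  n * sizeof(float));
    for (int i = 0; i < n; ++i) spot[i] = 90.0f + (i % 20);   // toy prices

    int block = 256;
    int grid  = (n + block - 1) / block;             // enough blocks to cover n
    payoff<<<grid, block>>>(spot, out, 100.0f, n);   // thousands of threads at once
    cudaDeviceSynchronize();

    printf("out[0]=%f out[n-1]=%f\n", out[0], out[n - 1]);
    cudaFree(spot);
    cudaFree(out);
    return 0;
}

The point is the shape of the code: the per-element work is trivial, but it fans out across thousands of hardware threads, which is exactly what makes such scientific and financial workloads a good fit.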
I could imagine an inversion of the CPU/GPU relationship where the CPU becomes a GPU loader and coordinator, essentially loading code and coordinating parallel processes. A lot of this assumes that software people figure out how to develop for massively parallel systems. I think functional languages like Erlang can help us get there. You basically want a language that discourages implicit data sharing and implicitly exploits however many cores are there without the developer having to do much.
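And a rough sketch of that inversion in the same CUDA terms (again my own toy example, with hypothetical names): the host below does no arithmetic at all; it only loads data and coordinates independent parallel jobs on streams, which is the "loader and coordinator" role described above.

// Toy sketch of the CPU-as-coordinator pattern: the host loads buffers,
// fires off independent GPU jobs on separate streams, then just waits.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void crunch(float *chunk, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) chunk[i] = chunk[i] * chunk[i] + 1.0f;   // stand-in for real work
}

int main() {
    const int kStreams = 4, kChunk = 1 << 18;
    cudaStream_t streams[kStreams];
    float *buf[kStreams];

    for (int s = 0; s < kStreams; ++s) {                 // "loader" role
        cudaStreamCreate(&streams[s]);
        cudaMallocManaged(&buf[s], kChunk * sizeof(float));
        for (int i = 0; i < kChunk; ++i) buf[s][i] = (float)i;
    }

    for (int s = 0; s < kStreams; ++s)                   // "coordinator" role
        crunch<<<(kChunk + 255) / 256, 256, 0, streams[s]>>>(buf[s], kChunk);
    cudaDeviceSynchronize();                             // wait for all streams

    printf("buf[0][2]=%f\n", buf[0][2]);                 // 2*2+1 = 5
    for (int s = 0; s < kStreams; ++s) {
        cudaStreamDestroy(streams[s]);
        cudaFree(buf[s]);
    }
    return 0;
}

All the actual computation lives on the GPU; the host thread is reduced to bookkeeping. Whether mainstream software ever gets written this way is the open question the comment raises.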