In conjunction with the VGX features embedded in all Kepler GPUs, Nvidia announced that it is working with online gaming companies to create a cloud platform for online gaming akin to that which OnLive has created with its own proprietary video compression and rendering chip.
This cloudy gaming platform, called the GeForce Grid, is not something that Nvidia plans to build and operate itself. Rather, it is a variant of Kepler GPUs and VGX virtualization, plus other gaming software, that will allow a single Kepler GPU to handle up to eight game streams, doing all of the rendering on the back-end servers where the games run and beaming the results out over the internet to PC clients.
The power consumed per stream will be about half of what Fermi GPUs require, according to Huang.
Another dollop of secret sauce in the GeForce Grid is fast streaming, which captures and encodes a game frame in a single pass and does concurrent rendering, reducing server latency to as low as 10 milliseconds per frame. This reduction can more than make up for the increased network latency of having your console in your hand – be it an iPad, a smartphone, or a PC – where you are playing a game tens or hundreds of miles away from the server on which the game is actually running and being rendered.
Online gaming company Gaikai, an OnLive competitor, worked with Nvidia to demonstrate Hawken streamed from its data center 10 miles from the San Jose convention center to an LG Cinema 3D Smart TV with a wireless USB gamepad. To our eyes, it looked like it was running on a local console, with no lag or jitter.
As you can see from the chart above, Nvidia and its online gaming partners think they have the latency issue licked when the GeForce Grid software is running on data centers that are not too far from players.
Interestingly, the speed of light is an issue: it takes 100 milliseconds or so to circle the globe under the best of circumstances, and several times more than that going through networks. But the idea is very simple: "What's good for gamers is good for Nvidia," as Huang put it. And Bill Dally, chief scientist at Nvidia, put it even better: "A cloud allows a kilowatt gaming experience on a 20-watt handheld device."
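Huang's round-the-globe figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below is our own illustration, not an Nvidia number: it assumes the equatorial circumference and, for the realistic case, light in optical fiber at roughly two-thirds of its vacuum speed.

```python
# Rough check of one-way, round-the-globe propagation delay.
EARTH_CIRCUMFERENCE_KM = 40_075        # equatorial circumference
C_VACUUM_KM_S = 299_792                # speed of light in vacuum
C_FIBER_KM_S = C_VACUUM_KM_S / 1.47    # ~68% of c in typical optical fiber

vacuum_ms = EARTH_CIRCUMFERENCE_KM / C_VACUUM_KM_S * 1000
fiber_ms = EARTH_CIRCUMFERENCE_KM / C_FIBER_KM_S * 1000

print(f"Vacuum: {vacuum_ms:.0f} ms, fiber: {fiber_ms:.0f} ms")
```

Even in a vacuum the trip takes well over 100ms, and in fiber closer to 200ms - before routers, switches, and retransmits add their share, which is why real networks take "several times more than that."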
With cloud gaming, you don't have to buy a console - and you don't even have to buy a game to play on it. In fact, Gaikai is distributing games through retail giant Walmart's Walmart.com. You can just go there and click on a game, pay for the right to play it, and off you go. No downloading for 36 hours, no console to buy. All you need is internet access that is fast enough to stream HD video - basically, if you can watch Netflix, you can play games online with GeForce Grid.
"We are not currently planning on hosting the servers ourselves," said Huang. "But if it makes sense, we are not against it."
The plan is for gaming companies and telcos to build their own GeForce Grids and come up with the pricing and billing, and the desire is to get a Netflix-style price of around $10 per month for access to a fat catalog of games. The game makers get out of manufacturing media and packaging, and the gamers get instant access to games on the cloud.
The trick is to make it ubiquitous – and, of course, more profitable. ®
Re: I call bullshit
Although you can see an easy measurement of ping, you can't see the other "hidden" lag.
The ping is just the time for a single packet (maybe not even a game packet but an explicit ping packet) to reach the server you're playing on (and return, I assume).
You almost certainly lose one frame no matter what anyway, because of screen buffering and game programming techniques. That's 16.7ms at 60Hz. Then your mouse is probably optical and USB - that could easily add the same number of milliseconds between you moving it, it being sensed, processed, and reaching the main CPU for it to act on (but only when it next hits an event loop, which could easily be half the above - so another 8ms or so!).
Then there's the time to send the image down the HDMI cable and the HUMONGOUS time the LCD might take to process it (even a 5ms full-dark-to-full-bright time means nothing, as a lot of modern LCDs "buffer" screen updates even further inside themselves - so although technically true, you could be 2-3 frames behind what you think you're supposed to see - yet another 35+ms!).
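Tallying those rough figures gives a back-of-the-envelope local budget. This is purely illustrative, using the comment's own numbers rather than any measured data:

```python
# Illustrative local-latency budget (ms) - the commenter's rough
# figures, not measurements, summed to show lag before any network.
budget = {
    "frame buffering (one 60 Hz frame)":   16.7,
    "USB mouse sensing/processing":        16.7,
    "game event-loop wait (~half frame)":   8.0,
    "LCD internal buffering (2-3 frames)": 35.0,
}

total = sum(budget.values())
for source, ms in budget.items():
    print(f"{source:<38} {ms:5.1f} ms")
print(f"{'total before any network ping':<38} {total:5.1f} ms")
```

Even before a single packet leaves the house, the local pipeline accounts for roughly 75ms - which is why ping alone understates perceived lag.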
There's more to responsiveness than ping time. A lot more. But ping time is easily measurable without any human bias at all. The rest aren't. There's no way to reliably measure just how long it takes you to respond to a screen change without taking a reaction test. And there the greatest error contributor is your brain processing and nerve response, which swamps all these other factors anyway.
I have run game servers (CS 1.6 etc.) for years. I consider 100+ ping unacceptable personally, and 200+ unacceptable for anyone entering the server. But my reaction times in even the quickest test - pressing a mouse button when you see a dot - swamp anything that my PC could be waiting for from the network. What they are saying with the "150ms" measurement is that there's an awful lot of stuff other than ping that affects perceived responsiveness in the average gamer's setup. But it still doesn't mean that they have solved those problems themselves, or that their system isn't liable to that same 150+ms "technical" latency.
No, it's not true that if you can stream Netflix then you can stream games. Netflix is unaffected by latency and very little affected by lost packets or jitter. Games are horribly affected by both. Those charts which show the Grid outperforming a home console assume 30ms latency, which I've rarely seen in real life. I for one never get better than 150ms, which is more than enough to put a home console way back on top.
Re: I call bullshit
150ms? Luxury, kids today don't know they're born. When I were a lad, we were up at half five, tying the packets to rats, 'cause we couldn't afford pigeons and even if we could the rats would've et 'em.