Optimus does this by using monitoring software to see what you're running and, according to a pre-determined policy, switching GPUs on the fly. Checking your email? Stick with the integrated core. Fire up Far Cry, though, and the discrete GPU immediately takes over - without any interruption to the display.
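That policy can be pictured as a simple per-application lookup. The sketch below is purely illustrative - the profile table, process names and `select_gpu` function are all invented here, not Nvidia's actual driver logic:

```python
# Hypothetical sketch of Optimus-style, profile-driven GPU selection.
# The profile table and function name are invented for illustration;
# Nvidia's real policy engine is considerably more elaborate.

APP_PROFILES = {
    "farcry.exe": "discrete",     # demanding 3D game -> discrete GPU
    "outlook.exe": "integrated",  # email client -> stay on the IGP
}

def select_gpu(process_name: str) -> str:
    """Pick a GPU for a newly launched process, defaulting to the IGP."""
    return APP_PROFILES.get(process_name.lower(), "integrated")

print(select_gpu("FarCry.exe"))   # discrete
print(select_gpu("notepad.exe"))  # integrated
```

Anything not covered by a profile falls back to the integrated core, which is the power-saving default.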
Under Optimus, the IGP always manages the display
Here's how. Past dual-GPU solutions used a multiplexer chip to relay the feed from each GPU to the display. It was the mux chip's switch from source to source that briefly stopped the flow of data from the display engine to the screen.
Optimus pulls the mux chip right out of the picture. Instead, it uses the integrated GPU as the sole source of screen data, and since the IGP is never turned off, there's no break in what you see on the screen.
Rather than drive the display directly, Optimus uses the discrete GPU - it has to be an Nvidia part, of course - to tap into the IGP's frame buffer and render the image off-screen on the IGP's behalf. The rendered image travels from the discrete GPU to the IGP's frame buffer in main memory over the laptop's PCI Express bus.
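The hand-off amounts to a memory copy: the discrete GPU renders a frame off-screen, then the finished image is blitted across PCIe into the IGP's frame buffer, which the display engine scans out continuously. This toy model (all names and the tiny buffer sizes are invented) just illustrates the data flow:

```python
# Toy model of the Optimus frame hand-off: the discrete GPU renders
# off-screen, then the finished frame is copied ("blitted") across
# PCIe into the IGP's frame buffer in main memory. Names are invented.

WIDTH, HEIGHT, BPP = 4, 2, 4  # tiny 4x2 RGBA "screen" for illustration

def render_on_discrete_gpu() -> bytearray:
    """Stand-in for the discrete GPU rendering a frame off-screen."""
    return bytearray(range(WIDTH * HEIGHT * BPP))

# The IGP's frame buffer lives in system memory; because the IGP is
# never switched off, scan-out from this buffer never stops.
igp_framebuffer = bytearray(WIDTH * HEIGHT * BPP)

def blit_over_pcie(src: bytearray, dst: bytearray) -> None:
    """Copy the rendered frame into the IGP frame buffer in one pass."""
    dst[:] = src

frame = render_on_discrete_gpu()
blit_over_pcie(frame, igp_framebuffer)
assert igp_framebuffer == frame  # display now shows the GPU-rendered image
```

Because the copy replaces a whole frame at once, the display engine never sees a torn or missing image - there's simply new data in the buffer it was already reading.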
It's a trick made possible by Windows 7, which allows a GPU driver to route rendering jobs to other processors when it knows they're available. Windows runs both drivers simultaneously, even though one may not be doing any work.
The reliance on Windows 7 tech means there's no reason why Nvidia's rivals, like AMD's ATI division, can't create an Optimus of their own. Nvidia is patenting the software techniques it uses to decide which app runs on which GPU, and says it owns the technology used to blit the rendered frame out across the PCIe bus to the IGP frame buffer.
The discrete GPU is only powered up when it's needed
Haas says that the process of engaging the discrete GPU takes "just a few" CPU clock cycles. Certainly, when he demo'd the technology to Reg Hardware, the jump from IGP to GPU was entirely smooth.
Laptops with 2 GPUs
I'm typing this on such a laptop - a Vaio SZ1VP, which I bought 4 years ago. I run the higher-end graphics card when plugged in to power / not on my lap (it gets hot) / playing games / needing the DVI connection, and switch to the onboard Intel graphics when away from the power supply, for MUCH longer battery life. Works very well, although my laptop requires a reboot to switch.
So instead of running GPU A *or* GPU B, now you run GPU A alone, or GPU A *and* GPU B.
Extra 5% reduction in battery life on FarCry ... just what everybody wanted!
Whole frame buffers flying across the PCIe bus sounds like fun too. But it's OK, no-one was using that bus anyway.
Some mobile phone multimedia chips use run domains and only switch them on when needed. So there are decode/encode domains, camera ISP domains, 3D domains, MP3 domains, that are only turned on when required. One chip I have knowledge of runs at 500mA doing all of the above. I've always wondered why desktop graphics don't do much the same.
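The run-domain idea the commenter describes - power up a block only while it has work - can be sketched like this. It's a generic illustration with invented names, not any particular SoC's firmware:

```python
# Generic sketch of on-demand power domains: each functional block
# (3D, camera ISP, MP3 decode...) is gated off until a task needs it.
# Purely illustrative; no real SoC API is being modelled here.

class PowerDomain:
    def __init__(self, name: str):
        self.name = name
        self.powered = False  # domains start gated off

    def __enter__(self):
        self.powered = True   # switch the domain on for the task
        return self

    def __exit__(self, *exc):
        self.powered = False  # gate it off again when work is done
        return False

gpu3d = PowerDomain("3D")
with gpu3d:
    assert gpu3d.powered      # on only while rendering
assert not gpu3d.powered      # off again afterwards
```

Optimus applies the same principle at laptop scale: the discrete GPU is one big power domain that exists only for the duration of its workload.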
Want to make big bucks?
Just make a laptop with a gamer's desk-dock-call-it-whatever-you-like expansion chassis.
Keep the hot, demanding GPU in it - heck, put 2 in. You could even chuck the big, capacious hard disks in there.
Here's a wild idea. Maybe even actually have 2 discrete CPUs, one ULV core on the laptop and a grunt heavy lifting one in the chassis, like an i7 or something.
In the mobile form - the actual laptop itself - just the frugal bits and an SSD. Eight-hour battery life would be good.
Make it cheap.
I'll buy one.
Laptops with hardware switches to switch GPUs?
Am I the only one here who's never seen or heard of these laptops with multiple discrete GPUs that use a hardware switch to swap between them? Can someone enlighten me, please? Model/manufacturer? Sounds to me like they're fixing a problem that never existed. Well done.