Apple turns to Intel for low-end laptop graphics
Mole alleges snub for Nvidia
Apple is said to be gearing up to use the next generation of Intel graphics in its MacBook and - probably - MacBook Air laptops in place of the Nvidia technology they currently use.
So claims CNet, citing unnamed moles - whether at Apple or - more likely, we'd say - Intel isn't made clear.
Intel's new graphics tech will be embedded into its 'Sandy Bridge' CPUs, which are due to be unveiled at the Consumer Electronics Show (CES) next month.
Intel detailed the Sandy Bridge family's core technology at Intel Developer Forum (IDF) in September, claiming the chips would offer a big jump in graphics performance, in part because the GPU is now part of the CPU die, rather than a separate in-package unit.
All well and good, but Intel has made big promises about improved integrated graphics performance before and has largely failed to deliver on them in any meaningful way.
Even Apple knows that, which is why criticism of its current low-end MacBooks' use of Core 2 Duo processors - rather than brand spanking new Core i chips - was addressed with comments to the effect that users will get more performance not from a better CPU but from a better GPU.
Indeed, Apple's decision to opt for Nvidia graphics rather than Intel's - both integrated, the former in the chipset, the latter in the CPU - led to a noticeable performance improvement. A case in point: this reporter's 11.6in MacBook Air, which uses a lesser CPU than the first-generation Air, is nonetheless snappier because it has Nvidia integrated graphics rather than Intel's alternative.
So if Apple has indeed committed itself to Intel graphics, we hope it has checked to make sure the CPU and GPU enhancements Sandy Bridge offers really are superior. We look forward to being impressed by Sandy Bridge's graphics - for Intel GPU technology, it'll be a first.
Still, integrated graphics have their limits, and the CNet report suggests Apple won't take this approach with its MacBook Pros, which will use AMD discrete GPUs, according to the mole. ®
Wait and see
Probably makes more sense to use AMD's integrated offering now that the low-power versions are available, and continue to differentiate at the high end with discrete parts. Having AMD, Intel and nVidia as suppliers should help with price negotiations as well: Apple doesn't need Intel's engineering expertise as much as it did a few years ago.
Of course, there is also the possibility of using the PowerVR stuff from the iPhone for graphics. That really would put the cat amongst the pigeons!
A big leap in GFX performance
Firstly, I have heard that before from intel, and it never amounted to much.
Secondly, when you are coming from such a low base (as intel are) then a "huge leap" doesn't necessarily amount to "great", "good" or even "acceptable" performance when compared to others.
Thirdly, it is going to take more than putting the CPU and GPU on the same die to make intel a contender in the graphics market.
Honestly, if I were intel, I would be seriously looking at purchasing nVidia* and just stop with the pretense of building their own sub-par graphics chips.
They are simply not capable of doing much other than massaging their ancient x86 platform. Almost everything they have tried outside that sphere has been pants.
* Not that I personally want that to happen. I like nVidia cards and I would hate to have to move to AMD/ATi just to avoid purchasing stuff from intel.
"huge leap" / "great" / "good" / "acceptable"
IIRC Intel reckon their integrated graphics are "good enough" (at least according to some guy at AMD!):
The most amusing Intel-bashing came from Rick Bergman, senior vice president and general manager of AMD's products group, who took issue with what he claimed was Intel's identification of its graphics performance as "good enough".
If they're already "good enough" why bother with anything like a "huge leap", eh, Chipzilla?
I suppose people are just spending all that money on nVidia and ATI graphics on the off chance Intel graphics aren't "good enough".
Pint - it's Friday
Intel graphics chipsets aren't that bad
Whilst Intel occasionally make claims about gaming performance, that has never been their real focus. Their graphics chipsets provide decent 2D acceleration and video decoding with low power usage.
For laptops and general office productivity their chipsets are ideal. I have Intel chipsets in my work systems and my laptops and, with few exceptions, don't want or need more power. My home desktop, which handles games as well as productivity, is of course a different matter.
Old MacBook Air
The old MacBook Air used to overheat and underclock itself for thermal reasons; that's why the new ones are faster for CPU-bound tasks. Graphics chip performance plays a very small role in overall computer performance unless you are doing 3-D gaming. Sure, interface elements in OS X (and now Vista) are drawn as texture-mapped polygons, but you are only going to be looking at maybe a couple of hundred of those on-screen at any given time, and any graphics chip made in the last 10 years will be able to handle that no problem.
Actually, if you look at the Hackintosh community, it is fairly common for people not to be able to get 3-D acceleration working, in which case the OS rasterises everything in software. It's noticeably slower for animations like the Dashboard fading in and out and Exposé, but otherwise it's barely noticeable.
So, the importance of graphics performance is completely overblown, with exceptions for gaming and potentially using the GPU as a general-purpose compute engine... not sure if any mainstream software does that yet though.