Personal Tech

AMD, Intel hate Nvidia so much they're building a laptop chip to spite it

Just months after Chipzilla trashed its new best friend as an 'unreliable supplier'

By Chris Williams, Editor in Chief


Analysis Arch-rivals AMD and Intel have put aside their animosity toward each other, and united against a common foe: Nvidia.

On Monday the pair revealed they are working on a chip family that will combine an Intel Core x86 processor, a customized AMD Radeon GPU, and HBM2 – high-bandwidth memory – all in one package. The silicon will, we're told, hit the market in the first quarter of 2018 in thin and lightweight notebooks, allowing them, it is hoped, to play top-end games.

Specifically, the chipset will use Intel's eighth-generation H-series Core family, presumably from its Kaby Lake blueprints, and an AMD Radeon-branded architecture, either Vega or Polaris. Both Chipzilla and mini-Chipzilla are keeping quiet on the exact specifications until launch. It appears Intel is buying the GPU silicon from AMD, and using a driver from its new best friend, rather than licensing technology.

Crucially, Intel will use its Embedded Multi-Die Interconnect Bridge (EMIB) technique to glue the GPU tightly to the HBM2 RAM to increase performance and lower the power drain. The AMD part is then wired to the x86 system-on-chip as usual, presumably via PCIe.

It's quite possible the Core processor could keep its built-in Intel GPU, activating the beefier AMD Radeon part as needed when firing up a game, or similar, which would give the computer greater control over its power usage. The H-series is power hungry and has only a basic built-in GT2 GPU, so with a decent AMD part strapped on it could be a good fit for a gaming-friendly notebook.

EMIB essentially places two separate dies on a common substrate with buried copper interconnects transferring signals and power. It's a more compact, high-speed, and energy-efficient arrangement than discrete components talking to each other over a PCB, which is ideal for squashing the technology into a laptop. Intel has been touting this technique for a while, so what we're looking at here is the first consumer product using the layout. Intel's Altera wing already uses EMIB in its Stratix 10 FPGAs.

Cross-section ... Inside the EMIB design (Source: Intel)

The project reveals just how much Intel and AMD, sworn enemies themselves, hate rival GPU designer Nvidia, which also makes laptop graphics processors. Intel has its own graphics cores that it bundles into its x86 chipsets, and these aren't necessarily going to be chopped as a result of the AMD partnership: there will still be a place for them in Chipzilla's vast product range. No, Intel wants to put a bullet in Nvidia's dominance of the performance graphics chip world, even if it means teaming up with AMD, which has long accused Intel of abusing its virtual monopoly in the x86 processor space.


Nvidia has been a thorn in Intel's side not just in graphics but also in AI. Chipzilla's attempts to beat Nv on hardware for running neural-network applications aren't really going anywhere – the Xeon Phi family is too complex and expensive to program, and is not long for this world.

Intel is shelling out millions buying upstarts, such as Movidius and Nervana, to give it specialized silicon and software for performing machine-learning tasks to compete with Nvidia. And let's not forget the $1.5bn Intel has had to cough up to Nvidia for patent licensing.

Intel needs to give its sagging desktop offerings a shot in the arm, and that includes its laptop side, and AMD's Radeon expertise seems to be the best bet for it. Meanwhile, AMD and Nvidia have long butted heads as GPU designers. AMD will take Intel's leg-up to sucker punch rival Nvidia, although what that collaboration means for AMD's Ryzen Mobile chips, aimed at laptops, and the antitrust battle against Intel in Europe, is uncertain.

The deal will give AMD a deeper route into a world where Nvidia holds sway. The parts could end up in future Apple Macs; Intel has indicated the components will be available to a number of manufacturers. Apple is another company that loathes Nvidia, so a combination of Intel and AMD ticks a lot of boxes.

Speaking of hate, though...

This hookup between Intel and AMD is all very weird and exciting, and also bananas. Just a few months ago, in June, Chipzilla summoned journos to its campus in Oregon, USA, including El Reg, and showed us a series of slides that trashed AMD.

Now, we're used to vendors rubbishing each other behind their backs, but this was something else. This was an in-depth drubbing that had some scribes with advance knowledge of the presentation warning Chipzilla's executives before the briefing that it was a little much. It would backfire. We focused on Intel's processor design work in our subsequent write-ups of that meeting, rather than dwell on Chipzilla's thoughts on AMD's engineering decisions.

But now, seeing as Intel and AMD are chums all of a sudden, let's revisit the trash-talking Chipzilla gave its new buddy behind closed doors in June, just a few months ago.

Back then, Intel wrote off AMD as having a "poor track record" and being an "inconsistent supplier." Nice words for a company you're buying GPUs from. It also accused AMD of lacking a software and hardware ecosystem, and put AMD's Zen processor architecture on blast for providing "inconsistent performance" on desktops.

There was a big bunch of slides, but here are the best from that Intel briefing. You can click or tap to enlarge any of them. First off, here are the infamous claims against AMD's ability to supply components and provide performance:

And that it can't maintain an ecosystem:

And slagging off its memory designs:

And pointing out that virtual machines requiring many cores may end up spanning internal interconnects, slightly hitting performance:

Now, the above briefing focused on Intel's x86 server-grade chips versus AMD's x86 offerings in the data center – and you can read our take on AMD's Zen-based Epyc server chips here – rather than the Radeon GPU side, which is what the above collaboration is all about.

But it's more than just a little awkward that Chipzilla effectively threw its minnow of a rival under a truck, and has now picked its competitor up, dusted it off, slipped it twenty bucks for some GPUs Intel can't design itself, and told its tattered new best pal to smile for the cameras.

With friends like that, who needs enemies like Nvidia? ®


