Why Google and Amazon could end up cooking their own chips
Good lord, 'tech' firms doing technology!
Open ... and Shut It used to be that the ante for being a serious player in the technology game was your own data centre. Or several. On this basis, Google, Facebook, Amazon, Microsoft and a very few others have constructed hugely expensive data centres that concentrate much of the web's activity.
But among these web giants, it may no longer be enough just to have expensive data centres. Today's ante is increasingly the chip.
It was perhaps not a huge stretch for a hardware company like Apple to get into the chip business as it sought performance and margin gains on its iPhones and iPads. Initially this just meant that Apple designed its own A6 processor, while still having Samsung manufacture the chips and lend some design help.
Now, according to a report in The Korea Times, Apple is poised to completely dump Samsung as a design partner and use it solely for manufacturing. This is helped by Apple's hiring of Jim Mergard, a chip design veteran who spent 16 years at AMD but was most recently employed by Samsung.
Apple's relationship with Samsung, in other words, is getting more distant all the time. As one Samsung source describes it:
There are three kinds of chip clients. Some want us to handle everything from chip design, architecture and manufacturing. Some want us to just design and manufacture. Some want us to just make the chips. Apple is now the third type.
This puts Apple in a position to move production to another fab, most likely TSMC, with which Apple recently signed a deal to produce quad-core processors for future devices.
But this isn't just an Apple story. Few would question Apple's willingness to go to great lengths to own as much as possible of its supply chain. But Amazon?
The Next Web reports that Amazon is considering acquiring Texas Instruments' high-end processor business. As with Apple, the aim would be to take more control of the chips powering Amazon's increasingly potent mobile business.
As some point out, however, this isn't exactly a match made in heaven. For example, Amazon doesn't push enough volume of devices to make the deal pay for itself. At least, not initially.
But what if Amazon CEO Jeff Bezos sees a future consumed by digital content? Well, in that case, he may not be able to afford not to own chip design on a grand scale, just as owning data centres has become essential.
Is this where the industry is heading?
Years ago, Tim O'Reilly warned the web giants, including Microsoft and Google, to: "Do what you do best, link to the rest." Worried about a future where the internet ceases to be the platform and is replaced by vendor-specific platforms, O'Reilly urged these tech titans to remember the power of an internet operating system composed of "small pieces, loosely joined."
Well, that may have been where the web started, but it certainly doesn't feel like where it's going. Owning a data centre is a way to amass, store, and crunch massive quantities of data. Mobile devices, powered by chips, are increasingly the top data collection and dissemination point for these cloud services. It may be advantageous to own both, and not just one or the other.
At least, this seems to be the thinking as Google, Amazon, Microsoft, and Apple all get into the hardware business. This started just with the devices themselves, but is quickly becoming a matter of chips, the "brains" of the devices, too.
In other words, it's Compuserve 2.0 at web scale, and it's unlikely to be what we as consumers want in the long run, even if it makes our lives easier in the short run. ®
Matt Asay is senior vice president of business development at Nodeable, offering systems management for managing and analysing cloud-based data. He was formerly SVP of biz dev at HTML5 start-up Strobe and chief operating officer of Ubuntu commercial operation Canonical. With more than a decade spent in open source, Asay served as Alfresco's general manager for the Americas and vice president of business development, and he helped put Novell on its open source track. Asay is an emeritus board member of the Open Source Initiative (OSI). His column, Open...and Shut, appears three times a week on The Register.
Nothing about this surprises me.
ARM has been making huge inroads into chip design over the past several years. They have quality designs and great performance, and with the capability of essentially "white box" labelling them, it absolutely makes sense for companies like Apple and Nvidia to use ARM instead of buying from Samsung / Intel.
Which is essentially what's going on. These companies are licensing designs from ARM, then sending them off to the "best" fabricator they can get. Of course, "best" is defined by some combination of production quality, price, etc.
Point is, they aren't actually "making" their own chips. Instead, they are just licensing the designs with minor tweaks and sending them to production. The days of hiring one firm to handle all of that are going away.
Re: Samsung - designed to fail.
Saying that Apple is using A9 cores is quite disingenuous is it not?
I could just as truthfully say that they are already using A15s.
In reality both, and neither, of those are true.
In my world A15 shows interesting promise for certain computing workloads, but I can't help but feel that Apple did precisely the right thing by incorporating what is good about A9 and A15 in their own package that is laser focused on power efficiency and performance as hand in hand requirements.
My guess is that, at least for a time, others (including Amazon) may decide they need the same design latitude in order to drive excellent user experiences.
History repeating itself?
IBM tried to lock out other vendors from its mainframe biz back then.
Microsoft also tried to lock out other vendors from Windows (most notably Word/Excel competitors).
Should be interesting to see what happens going forward...
Give it a couple more months and buy AMD once their management have finished crashing it into the dust.
Re: PowerPC Macs
PowerPC performance per watt was and still is inferior; Steve was absolutely right to dump that shit