Google versus Facebook: stop your photocopiers
Battle beyond Windows, Jobs, and Linux
Open...and Shut The desktop is dead. Just ask Microsoft and Apple. Or, better yet, ask Facebook and Google.
Sure, we still use our desktops and laptops, mostly Windows PCs and Macs. What else would we use to draft our faxes? But the industry has moved on, and the petty squabbles over Windows vs. Mac vs. Linux no longer resonate like they once did. The twentieth-century desktop has given way to a new breed of "desktop" platform.
It's called the web. Have you heard of it?
This shift from local bits to web bits derives in part from the market leaders' inability to get excited about their desktop products anymore.
Don't believe me? Think about the last few releases Microsoft and Apple have given us. Microsoft didn't herald Windows 7 as a major breakthrough in computing, but rather sought to placate CIOs with the slogan "This time, Windows actually works!"
Nor is Microsoft alone. Apple's Snow Leopard, released in August 2009, was little more than a service pack for Leopard, released two years previously. Neither release broke significant new ground in computing.
You have to go back to 2005 to find a time when Apple and Microsoft sought to convince the world that they were making waves with their desktop operating system releases. Apple then challenged Microsoft to "start your photocopiers," but the reality is that neither has done much to appreciably push the state of the art in the last several years.
There are rumors that Apple may be set to change this with things like embedded projectors in its Macs, but even these veer toward hardware innovation, not software innovation.
Of course, much of Apple's innovation has moved to iOS, its mobile operating system. Plenty of developers have followed Apple (and Google) to mobile OS development. Android seems to be gaining the upper hand, as the chart below shows, but there's plenty of overlap between the iOS and Android developer communities.
But this detour into mobile OS development is temporary. Longer term, web development will win.
Why? Because the web offers the broadest possible audience for a developer. The web has become more mobile and much more social in the past few years, and points to the next generation of personal computing powerhouses: Google and Facebook.
The two companies increasingly spar over online advertising and social networking, with each encroaching on the other's turf: Google pulling a Facebook with "Google Me" while Facebook potentially offers a more compelling advertising model.
But the feud is felt most poignantly in their recruiting, with Facebook seeming to steal more top-tier talent from Google than the reverse. Most recently, Google's Chrome OS lead, Matthew Papakipos, dropped the search giant for Facebook, perhaps pointing to Facebook's ambitions to broaden its reach as the industry's default web platform.
Apple, though currently on a tear, doesn't make the cut, in large part because its curation model ("Father Jobs knows best") can't scale to meet developers' needs. If Apple must get involved every time a developer wants to create or update an app, it simply won't be able to compete with Google's more open approach.
Apple also limits its developer audience by tying developers to its hardware. No matter how much Steve Jobs may wish otherwise, some hapless souls will buy iPhone alternatives. Developers need to reach these people too.
Meanwhile, Facebook offers developers much more compelling economics, as Gartner points out, which is likely to result in more development moving to Facebook over time.
Importantly, neither Facebook nor Google care which desktop (or mobile) operating system a person is running: they work across disparate platforms. As noted, this gives them the broadest possible audience, both in terms of developers and consumers.
As such, the desktop is increasingly just one launch pad among many for Web "operating systems" like Facebook and Google, which treat OS X, Windows, and Linux as relics of a bygone age. They're enabling developers to disintermediate the device and write applications directly for the Web.
And that is why they'll win. ®
Matt Asay is chief operating officer of Ubuntu commercial operation Canonical. With more than a decade spent in open source, Asay served as Alfresco's general manager for the Americas and vice president of business development, and he helped put Novell on its open-source track. Asay is an emeritus board member of the Open Source Initiative (OSI). His column, Open...and Shut, appears every Friday on The Register.
Pardon me if I sound cranky, but...the desktop isn't dead, and it's not going to die any time soon. There are a couple of problems, both with web-based services and with the hardware that provides access to them. I don't know that either set of problems can be fully overcome.
Although the technologies have changed, the fact that people want a real, live computer hasn't. The computer has, of course, changed shape as mobile phones, tablets, and whatever else have gained processing power and shrunk in size. Even so, the desktop (and laptop) computer with local applications still dominates the computing scene, because some devices are simply better suited to large amounts of input than others. Unless you had an external keyboard, I dare say you'd rather not type your great novel on an iPad. All of that is pretty well known.
History also shows that, despite repeated attempts, terminals and thin clients that pull their computing resources from afar haven't really been successful since the personal computer made the scene. And for all of today's improvements in technology, I don't see this changing. This also seems to be pretty well known and (I'd dare say) accepted.
If anything, the interoperability of software has improved and the OS has paled in importance, as it should have. It used to be that I had to buy a certain type of computer and operating system to run a given program; today that's far less true. I don't think this has as much to do with the web as with the advent of high-quality software that's readily available cross-platform. I can finally, reasonably say that a lot of the software I want to use is available on whatever platform I might have.
Maybe I missed something, but I thought that the best thing Microsoft /could/ say about Windows 7 is that "This time, Windows actually works!" Apple said long ago that Mac OS 10.6 was going to be a quality improvement release, paving the way for future generations and giving them the chance to clear out some accumulated junk.
I don't fully trust Google. I don't trust Facebook at all, and refuse to use its services. In fact, I don't fully trust any web-based service with all the things I'd normally store locally. Why, then, would I want to entrust the things on my computer, things that I created myself--some of which should ideally or must remain private--to a third party? I don't know what Google does to keep data secure, and I don't put it past them to look at the things stored within their network, even if it's only in the course of an employee doing their job. I do know that even the best web application developers can't see every possible hole or flaw, and that they are competing against a nearly immeasurable audience of people who will try to pry where they should not, to see what secrets the system will give up.
(Whew! Sometimes even a Model M needs a break...)
All of that says nothing about what happens when your connection to the 'net goes down and you need your data.
And *that* is why the desktop computer, with its local storage, applications and memory, isn't going anywhere any time soon. Web-based services have their uses, but I maintain that they are not, and can never be, a 100% replacement for the full-fledged computer and locally installed applications. They're more a (sizeable) niche in the market than a "win".
Your Head In The Cloud
Ah ... no.
The problem with depending on the Web for everything is a matter of trust. Do you really trust these people with your data? With your day-to-day operation? I'm sorry, but I don't. Microsoft, Apple, Google, et al. are all cut from the same cloth, willing to sell their own grandmothers (or your privacy) for a buck. I trust nobody but myself to look after my own interests. I keep multiple layers of paranoia between myself and the Web in general, and multinational, multi-billion-dollar corporations in particular. And don't get me started on how far any *government* can be trusted.
The bottom line is that desktops and laptops are here to stay, and while the Web has become an integral part of the computing experience, it is merely a component of that experience, after the fashion of a Firefox add-on, and not the center of the computing universe. The industry "leaders" may have moved on, but the users haven't. What we've got now works very well, thank you.
Anyone who relies on the web for anything of any permanence is a complete and utter fool.
Thinking about it, maybe that proves you right.