Who'll keep taking Windows Tablets in the iPad era?
BillG's 2001 proto-fondleslab that failed
Andrew's Review Notes

I have a lot of sympathy for people who steal their technology from the hearse, just as it's driving through the gates of the great technology knackers' yard.
While it is obligatory to be au fait with the latest in design and innovation, when it comes to my personal spending I'm right there with the laggards, on the extreme right of the Technology Adoption Lifecycle.
You've heard of Early Adopters, and maybe Happy Hour Adopters. I'm a Last Orders Adopter. I made good use of a Palm long after everyone else had sold theirs. I used OS/2 when it was well and truly doomed; BeOS when it had secretly been de-emphasised by Be Inc in favour of kitchen appliances – the dot.com-version of the fondleslab.
I waited until the third iteration of the iPhone to leap in. I got into Ubuntu on laptops just before everyone decided "Good Christ, this megalomaniac has destroyed everything I like about my favourite Linux distro. Bye!"
There are many advantages to being a Laggard: the platform is usually as cheap as chips, and the bugs will have been fixed. It will probably play nicely with everything else. And you avoid all the pains of the early adopters.
I recall developing on what was DEC's first Alpha NT Workstation, when only a couple of dozen were in the UK. Alpha was brand new – and extremely sexy. And NT was so much a part of the future that every UNIX guru was privately learning to get a handle on the new system. Alas, the Alpha didn't always boot – something to do with the keyboard driver, as I recall.
And sometimes as an early adopter you just drive down a dead end. When I ditched the Psion for a Nokia Communicator in 2001 I lost the ability to type notes quickly, traded a great diary for a lousy one, and got a brick in exchange. It was a terrible decision – one really based on faith.
Now I see the same faith expressed by people who have begun to store their password files on Dropbox, and boast about how their passwords are always available thanks to The Cloud, and how this is the future, and we should all get with it. Er, good luck, guys ...
So I have some sympathy with the Windows Tablet community, and I marvel at how enthusiastic and helpful they still are. The community buzzes away at sites like TabletPCReview. Well, mostly it is TabletPCReview.
The Windows Tablet was one of two Big Ideas that Bill Gates spawned in Microsoft's world domination era, from the mid-90s on. Bill's great idea was to put a database in Windows and build the file system on top.
That was actually a great idea – as you old PICK and AS/400 veterans know – and with Longhorn in 2001 Microsoft set to work. But it proved so hard to implement that Microsoft had to throw it all away and start on (what became known as) Vista from scratch. I don't think it will ever come back – there's so much sunk into the architecture as it is. And who, exactly, complains?
Tablet PCs were Bill's other big idea. By 2007, predicted the great man, Tablet PCs would become "the most popular form of PC".
"I can't see Redmond buggering it up this time, but you never know."
Really, even with my eyes shut.
This is not simply MS bashing; I just have an undying faith in the ability of people anywhere (and that includes myself) to get things terribly wrong in ways we could never imagine. This is the kind of inverse creativity embodied in Bergholt Stuttley (or Bloody Stupid) Johnson on the Discworld.
However, MS might get it beautifully right.
That's not quite correct
MS did get it wrong, and they demonstrably lacked vision. I don't believe the "before their time" argument washes. MS launched tablets earlier than Apple and is still making them. So "their time" spans the very period in which another company has proven how an appealing product can be made. They should have had a learning advantage. So MS have, as the article points out, ended up on tram rails and manifestly failed on the vision front.
Also, it's not correct to say Apple scaled up iOS from phone to tablet. The iPhone actually came from an Apple tablet project. Apple had been researching multi-touch UIs for years, and always kept an R&D line open for next-generation UIs and interaction (I think I read somewhere this goes as far back as the days of the Newton). Along the way Steve Jobs publicly noted on a couple of occasions how bad phone UIs were, and evidently decided to adapt what they had for tablets to phones. I suspect that as the tablet R&D continued to progress he was quite amazed at how the incumbent handset companies failed to rise to the challenge of providing a good UI, and decided to go for it.
One big factor determining the order in which the products were released was that large capacitive displays were expensive (and difficult to manufacture), so it ended up being easier to deliver a smaller phone device first, before the planned tablet. Also, Apple were sure the user experience had to be built on capacitive touch technology to really work (another failing on Microsoft's part: their R&D never reached the same conclusion). Clearly they were right. You have to admire the extent to which they refused to compromise: they knew what the market demanded, and refused to launch into the space until they had exactly the combination of features they thought was required.
Apple also did a lot of research on the form factor. The 10" display came out as the clear preference (a conclusion now fully backed up by recent publicly available market research), and they waited to launch a capacitive-display tablet until they could implement the ideal. RIM choosing 7 inches was really about cost and supply-chain issues (though they claim otherwise – but they would, seeing as Apple grabbed the available supply of 10-inch capacitive displays at the iPad launch).
At the time of the iPad launch, others scoffed at Jobs' claim that the iPad was cheap at the price. But Apple had done their research and prepared the supply-chain logistics well. Others have only been able to match the price by shrinking margins to nothing just to get a toehold in this new market. That will change, of course, as display manufacturing volume ramps up and Android makes gains, but for now Apple rules the roost.
Arm netbooks conspicuous by their absence
Aren't they? Computex 2009: Asus shows an Arm-based netbook running Linux, to be released in summer 2009.
Cue a joint presentation by Asus, Intel and Microsoft, reiterating over and over how important Intel's hardware and Windows are to the future of netbooks and of Asus.
An Asus Arm-based netbook, did you say? What Asus Arm netbook? (The cancelled smartbook.)
This year Asus have committed to releasing a Tegra 3-based netbook running Chrome by the end of 2011. I wait and hope.
I really, really want a netbook that runs all day without recharging and has a friggin' keyboard, so you can actually do some work on it.
Meanwhile, in ten years we might get proof that Intel and Microsoft paid off Asus and other manufacturers so they wouldn't defect to Arm and not-Windows, much like their past dealings.
Lack of imagination
Lack of imagination was the problem. The people in charge at Microsoft are not forward thinkers; they don't seem to have any imagination when it comes to creating something new. They are happy to clone other ideas and put their own spin on them once the competition has made a move.
When the Tablet PC was being worked on there wasn't anything else to compare it with. So when key people at Microsoft (the head of MS Office, for example) were asked to adapt their software to work with the new tablet input method, they refused. It's almost as if they can't see the merit in doing something unless someone else is bringing in cash for a similar product.
But this time there is a product they can refer to; they can make things happen, as there are plenty of alternatives raking in cash.
The L word never had a problem running on ARM – anything from Android to a supercomputer. I don't think it's a problem of getting code from x86 to ARM and thinking smaller; it's just that all these companies write proprietary code that probably looks like barf, with bits corrected in crayon.