Web 2.0 and Tim O'Reilly as Marshal Tito
Forward to the (distributed) revolution
Comment As the Web 2.0 bandwagon continues its rapidly accelerating path downhill towards the inevitable crash we find ourselves at another turning point in the development of the networked world.
Ten years ago we were faced with a choice between the controlled homogeneity of the ‘information superhighway’ or the many and various delights of the unsupervised Internet, and we chose wisely.
Now we must decide whether to put our faith in Ajaxified snake oil or to look beyond the interface to distributed systems, scalable solutions and a network architecture that will support the needs and aspirations of the next five billion users.
The choice may seem obvious, but the pull towards the dark side is powerful.
Many believe that the best way forward is the one sketched out by Google, Yahoo! and Amazon as they offer tempting APIs and non-standard data formats to enthusiastic developers keen to add some scripted magic to even the most banal website. Others look to social networks, virtual worlds, user-generated content and the end of mainstream media as markers on the way to the promised metaverse, the land of prims and money.
The wrong question
If Web 2.0 is the answer then we are clearly asking the wrong question, and we must not be fooled by the cool sites and apparently open APIs. Most of the effort is – literally – window dressing, designed to attract venture capitalists to poorly-considered startups and get hold of enough first-round funding to build either a respectable user base or enough barely runnable alpha code to provide Google or Yahoo! with yet another tasty snack. We need to take a wider view of what is going on.
Back in the 1870s Karl Marx outlined the steps through which he believed a capitalist society needed to pass before it could reach socialism. After the revolution came the dictatorship of the proletariat, a painful but necessary stage of oppression and correction, during which the organs of the state would wither away as humanity achieved its true potential and coercion became unnecessary.
Web 2.0 marks the dictatorship of the presentation layer, a triumph of appearance over architecture that any good computer scientist should immediately dismiss as unsustainable.
The impossible dream
Yet this is the dream, one promoted relentlessly by Tim O’Reilly and his acolytes since the term was first coined in 2004. Thanks to their efforts, and those of other cheerleaders for the 'new paradigm', there is now a real danger that continued investment in Web 2.0 companies will turn O’Reilly’s dream into our nightmare.
If that happens then the oligarchy who benefit most from the stale socialising of Flickr and YouTube will have held back the transition to distributed systems just as the old men in Soviet Russia and the People’s Republic of China retained power and held back the transition to true socialism that Marx had predicted.
Fortunately, O’Reilly seems less of a psychopath than Mao or Stalin, and is perhaps closer to the pragmatic Yugoslavian leader Marshal Tito, who carefully steered a path between the USSR and the West for decades.
O’Reilly has already announced that Web 2.0 is really about business opportunities and new markets rather than the emerging collective intelligence of humanity he preached from the barricades last year, so perhaps he will have the sense to move his followers away from Ajax towards something grounded in decent engineering.
If we can unlearn the lessons of the old Web and transcend its stateless protocols to achieve real distributed processing over a managed, trustworthy network then the possibilities truly are remarkable.
We can start to build hybrid applications that use modular code and distributed services, some local, some remote. We can introduce yet another level of abstraction – always the solution to any computer science problem – and get our codebase away from processor dependency.
We can build a network that doesn’t care or notice if your libraries are local or remote because the stuff you use regularly is always where you need it to be, cached on your local storage when needed, on a remote server when you’re online. And we can do it all without ceding control to Google, Amazon or even Microsoft.
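The idea of a network that doesn't care whether your libraries are local or remote can be sketched as a simple caching resolver. This is a minimal illustration, not anything the article specifies: the class name, the cache layout and the `fetch_remote` callable are all hypothetical stand-ins for a real transport (HTTP, a message queue, whatever the managed network provides).

```python
import os
import tempfile

class CachingResolver:
    """Resolve a named resource from local storage, falling back to a
    remote fetch and caching the result, so callers never need to know
    where the bytes actually came from."""

    def __init__(self, cache_dir, fetch_remote):
        self.cache_dir = cache_dir
        self.fetch_remote = fetch_remote  # callable: name -> bytes

    def get(self, name):
        path = os.path.join(self.cache_dir, name)
        if os.path.exists(path):          # cache hit: purely local
            with open(path, "rb") as f:
                return f.read()
        data = self.fetch_remote(name)    # cache miss: go to the network
        with open(path, "wb") as f:       # keep a local copy for next time
            f.write(data)
        return data

# Usage: the second request is served locally; the caller can't tell.
cache = tempfile.mkdtemp()
remote_calls = []

def fake_remote(name):
    remote_calls.append(name)
    return b"library code for " + name.encode()

resolver = CachingResolver(cache, fake_remote)
first = resolver.get("libfoo")   # triggers one remote fetch
second = resolver.get("libfoo")  # served from the local cache
```

The point of the sketch is the interface: `get()` hides the local/remote distinction entirely, which is exactly the property the paragraph above is asking the network to have.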
If we sort out our interfaces and interactions we may even be able to put our heads into the screen, be part of the metaverse, enter cyberspace and interact fully and equally with agents, people, sims and any other machine- or human-generated intelligence. But this will not happen if we follow the Web 2.0 fantasy and put our trust in cool but ultimately shallow tricks with the presentation of data. The time has come to stand up and be counted, and we need people who can count in hex and see beyond the Web 2.0 hype. ®
Bill Thompson is a technology critic and essayist.
It's Perfectly Obvious
Even to a NOP like me it's perfectly obvious Mr. Riley wants his flying car, and, quite rightly, Mr. Thompson has reached out to grab the ignition keys from him. Mr. Thompson wants to know who's going to pay for all the flying cars, and he's damned if flying cars are going to swoop in over his garden unrestricted and uninvited.
Mr. Thompson, no doubt, thinks Mr. Riley should contemplate the error of his flying car ways. Critically important is the need for Mr. Riley to contemplate his waywardness in Milan's famous cathedral, the Duomo, which was built especially for such occasions.
If I'm not making sense, well then, all I can say is it flowed naturally from reading the articles and quotes.
Well, I'm a great believer in having a vision of where you want to go. I'm also a great believer in the bazaar being a tool for certain areas and the cathedral for infrastructural issues.
However, when something is new or fledgling - imposing or desiring to impose a method of operation rather than allowing a standard to emerge is almost always folly.
So have a vision, don't impose operating restrictions, let the melting pot of human creativity compete with different ideas, and then support the emergent standard if a standard is necessary or beneficial.
This is what is happening with web 2.0: a melting pot of ideas from which standards are being rapidly created or adopted, for example REST over SOAP, Mapstraction, etc.
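The REST-over-SOAP point can be made concrete with a small sketch. This is my own illustration, not from the articles: the endpoint URL and operation names are invented, and the SOAP envelope is hand-written rather than generated from a real WSDL. The contrast is the argument: a REST call is just a URL, while the SOAP equivalent wraps the same two values in an XML envelope that needs a POST body and a service description.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# REST: the request is just a resource URL plus query parameters.
base = "https://api.example.com/photos"   # hypothetical endpoint
params = {"tag": "sunset", "per_page": "10"}
rest_url = base + "?" + urlencode(params)

# SOAP: the same two values, wrapped in an envelope, sent as a POST
# body with a SOAPAction header and described by a separate WSDL file.
soap_body = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetPhotos><tag>sunset</tag><perPage>10</perPage></GetPhotos>
  </soap:Body>
</soap:Envelope>"""

# Round-trip the REST URL to show the parameters survive intact.
parsed = parse_qs(urlparse(rest_url).query)
```

Anyone with a browser can issue the REST request by hand, which is a large part of why it emerged as the de facto standard the comment describes.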
Of course this is a nightmare for any purist or committee that wishes to control or impose their own idea of how things should operate. Such evolved behaviour and evolved standards bypass the need for committees or imposed computing standards.
You can almost hear the gnashing of teeth as the purveyors of what is "right" discover they are irrelevant.
So we come to your tirade against web 2.0 and your description of Tim "Marshal Tito" O'Reilly.
Tim's vision, or the collection of concepts into his vision, does nothing to state how things will operate. It is more a description of important concepts (from open data to commoditisation of operating environments to social participation to rich interfaces to new business models to a data-centric view) and describes a progression to a different type of web. It subscribes to the view that this is not dictated but emergent.
A "real" application is one that is used and is useful, and has no bearing on language or transport protocol. As for supporting messages between distributed objects, well, those are standards which will emerge rather than be dictated.
Your general thesis is we should stop all this, as there is the real chance of it turning into a nightmare. That's a mantra against creativity, against the melting pot.
So is this a case of the pot calling the kettle black?
It's not Tim who is Tito ...
Who exactly is asking the wrong question here?
I'll make my comment brief:
You continually throw "computer science" terminology into the picture, as if that's everybody's ultimate goal. You're the one asking "how can we solve these computer science problems on the web and make distributed systems ultimately work?"
Isn't *that* the wrong question?
The "how" is not what is ultimately important. The "how" is something for a researcher to consider, not an application designer. As a developer, the job isn't to answer the how, but to use the tools available to get solutions that work. It's also about serving some end-point, be it those infatuated by cool AJAX effects on digg or those who feel the need to share their video content on YouTube. There's no need to bother them with "computer science"; they don't know or care that your servers run on a "distributed system".
I think the best analogy is this: you seem to think that web developers should invest their time solving the problem the way a biologist researches a disease. But remember, no biologist has ever made, marketed and sold a single bottle of pain-relief medicine. You're confusing two completely separate fields: research and development. They're two fields for a reason.