Web 2.0: you're not even slightly whelmed
More DOA than SOA
Letters
This month's Web 2.0 Conference has been heralded as the birth of a new technology frontier: by people who want to sell you tickets to Web 2.0 conferences, and by web designers who want VC funding.
We didn't coin the phrase Bubble 2.0 - The Economist beat us to that one by a fortnight - but we did suggest there were many other areas of technology that could usefully be improved first. Now here's what you think.
We also recommend Nicholas Carr's The Amorality of Web 2.0 which homes in on the New Age aspect - the web as transcendent religious experience - and Robin Miller's terrific confession that he was tempted to sell the VCs a blank CD, such was their eagerness to fund any old rubbish.
'Just as you can't build a house on sand, you can't build "a global operating system" based on a presentation layer and a few scripting kludges.' And I laughed out loud over the notion of CSS as a transmission protocol.
The self-congratulation of the digerati truly knows no bounds.
Bubble 2.0 indeed.
Brilliant and spot-on...
Please write more!
Robert Shaw ITU Internet Strategy and Policy Advisor
The webcon people should probably be told that IPOs and overpriced buyouts by big tech companies are very much Web 1.0.
Loved what you said about Bubble 2.0.
Personally I feel these techno-love-ins are about as far removed from the reality of networked technology as it's possible to get. I make my living from networked services and the number one thing I run into every day is stuff that doesn't work. Just about everything is a kludge: a hopeless rag-bag of ancient code and other bits of crap slopped together into a finished system just because it has a shiny front-end.
Take Flickr: no one can tell me that a company that was bought for $18 million has a backend infrastructure that cost anything like that. It's amazing it's as reliable as it is, considering the whole thing is probably hanging together on cheapo hardware and all kinds of hacked-together scripts that fix *that day's problem* - ending up as such an enormous legacy of badly written code that almost no one knows how it works and no one knows how to render it back down to a cohesive whole again. And these problems are prevalent in commercial products, free products, open source products - you name 'em, and there's a Titanic-sized software engineering problem that goes with them. We've created such a monster, such a headless goliath built on quicksand, that we're all sinking without trace. Everyday users are confronted with the same old problems we never resolve.
But let me get back to the Utopians. Hardly anything they suggest is capable of challenging the real problems our species faces now. An army of bloggers is not going to stop the ice caps from melting. But seriously, outside their pampered middle-class lives so much of this drivel has hardly any profound application. Take online voting. Just supposing someone came up with a perfect, fraud-free and simple way to do it. Do we then suppose it would actually improve the credibility of the democratic process by one iota? Edward Thompson said in the New Statesman more than 25 years ago that elections were "just a theatrical show of legitimacy", so all we'd end up with is crappy leaders, faster, because no one has gone back to re-examine the substance.
The Internet: it's good for news, current affairs and guerrilla journalism, sure. What else? An armchair rummage-sale courtesy of eBay? Tracking down celebrity-skin from Google? Badly encoded MP3s from eDonkey? Downloading porn and warez from newsgroups? It's hard not to feel utterly cynical about the whole thing when you see Skype making money from infrastructure it didn't invest a cent in, or more hugely overvalued dot-com cowboys starting gambling sites to rip more people off on a scale that would make even Vegas casinos blush.
Sometimes I find myself asking one question: is this kind of technology itself the problem? Are we really convinced it has anything substantial to offer other than peddling even more rubbish? If you ask me, the technology at our disposal now has decades to go before it's anything like as transparent, reliable and purposeful as it could be.
On the issue of Balkanization, Bill Nichols is more optimistic. One size doesn't fit all, he argues:
There may be another way to look at this.
Your comments about the deficiencies of Internet 1.0 are dead on, but certainly not the fault of the originators in 1968 and 197x. Our use is outgrowing the design of the original net, with somewhat predictable results. QoS, priorities, filtering, etc. are all adaptive attempts to remedy these limitations.
The high-speed Internet2 used by science and government is slightly different from 1.0 but, more specifically, is currently limited in who connects and for what. It will probably remain that way.
What will be the result? We will eventually realize that not only does one size not fit all, it *cannot* fit all. We are in the process of building multiple nets that will interconnect at well defined points, not everywhere, with well defined protocols and limits. It must be that way to serve the separate interests.
There are security, performance and access requirements that a single net can only approximate, at considerable expense and overhead.
Yet it will not result in Balkanization, but speciation. Nets will be optimized for classes of traffic and classes of applications. They will be invisible but there if you need them, efficient for specific purposes.
Overall, a network of nets will be more robust and more efficient than one master net that has to either warp itself or warp traffic to work. One net cannot work well with all these new demands, therefore we are seeing sprouts, seedlings of a new growth that will deliver the MultiNet(C).
I think Google's dark fiber may be the first step in this direction.
And right into the nitty gritty, Adesso's Phil Stanhope takes issue with many of the technical assumptions behind Web 2.0. Like the fact that we're always on, all the time, with a flawless broadband connection:
Andrew, Well said.
I sat in a workshop a few weeks ago with a collection of Global 500 CIOs on SOA. Interestingly, they understood that SOA/Web Services is, when you boil away the hoopla, just an interface API. That most web services are simple encapsulations over other (older) systems. That there is a huge difference between offering a beta service (perhaps indefinitely) in the consumer space, versus an intranet service that allows composite applications to be created in ways not anticipated in advance. But once those apps are created they become an instant legacy.
Those of them who'd had enough of an internal ecosystem of web services up and running were now tackling the inevitable: change management and version control problems. Yes, good problems to have, but nonetheless, old problems. You can't have the Flickr experience once you have an internal application in production that sales/customer service are depending on. In many, many ways, SOA as we currently have it is simply interface-based componentry where we've solved the transport/serialize/deserialize problem.
We do deeply believe in rich clients. And web services. And SOA. But there are many misconceptions out there. If you're dumb enough to deprecate a method in a web service ... or add another one but change the semantics ... or have a required semantic usage pattern that can't be readily conveyed in WSDL ... well, is it really any different than doing CORBA/OMG, DSOM, DCOM, COM+, or Enterprise Beans? No. Are people doing this? Yes. Is it better to get a SOAP exception that fails to be able to call a method in a service? Or is it better to never be able to compile/link in the first place?
There used to be a problem called "DLL Hell". How is changing a DLL after the original software shipped any different than revising a web-services interface and having no way to change the code that called the previous one? It isn't.
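Phil's point can be sketched in a few lines. Here is a minimal illustration (in Python, with invented names - this is not any real SOA stack): a client written against a v1 interface only discovers a renamed method at the moment the call fires, which is exactly the late-binding failure he likens to DLL Hell.

```python
# Hypothetical sketch: ServiceV1/ServiceV2 stand in for two releases
# of a remote service. The client was written against the v1 contract;
# nothing checks that contract until runtime, so a rename surfaces as
# a fault at call time rather than a compile/link error.

class ServiceV1:
    def get_quote(self, symbol):
        return {"symbol": symbol, "price": 10.0}

class ServiceV2:
    # The method was renamed between releases ("deprecated" in v2).
    def fetch_quote(self, symbol):
        return {"symbol": symbol, "price": 10.0}

def legacy_client(service):
    # Written against the v1 interface; fails only when invoked.
    return service.get_quote("VULT")["price"]

print(legacy_client(ServiceV1()))   # works
try:
    legacy_client(ServiceV2())      # breakage discovered at runtime
except AttributeError as err:
    print("runtime fault:", err)
```

A statically linked component would have refused to build against the changed interface; the remote call just throws, which is Phil's SOAP-exception-versus-link-error question in miniature.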
[This was written before the MySpace worm knocked out the site. That was one guy's first AJAX app, it took him a week to write, studying just one hour a day - ed]
My favorite way to show that Google depends on deep knowledge of the browser that it's trying to deliver bits to is to try and load news.google.com over a GPRS connection on a PocketPC Phone. It simply fails after a long period of time (i.e. it can DNS resolve, it can start to receive some bits from the source, but then it simply never loads). Hit news.bbc.co.uk (or theregister.co.uk for that matter), and guess what you get? An actual web page delivered to you that you can read. Wow ... what a concept!
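The failure mode Phil describes is easy to reproduce: put a hard budget on the fetch and see whether anything usable arrives before it expires. A hedged sketch (the 15-second budget and byte cap are illustrative values, not measurements from the letter):

```python
# Sketch: bound a page fetch the way a flaky GPRS link effectively
# does. A page that dribbles bytes past the budget shows up as a
# failure; a plain HTML page usually lands well inside it.
import socket
import urllib.error
import urllib.request

def fetch_within(url, seconds=15, max_bytes=65536):
    """Return (ok, bytes_read). ok is False if the connection
    cannot be made or the read outlives the time budget."""
    try:
        with urllib.request.urlopen(url, timeout=seconds) as resp:
            data = resp.read(max_bytes)
            return True, len(data)
    except (urllib.error.URLError, socket.timeout):
        return False, 0
```

Run against a script-heavy page on a slow link, the second case is the one you hit: DNS resolves, some bytes arrive, but the page never completes inside the budget.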
LOCKSS is interesting in many ways, but it just shows one example of a valid use case. At Adesso we focus on another one: delivering only the data to your devices that you need. Now, based on rules, that may be all the data you have on your other devices. Or it may be all that you need on a PDA. Or it may be all that you need to process your field service tickets.
What I object to about the Bubble 2.0 hallucination is the presumption that we're always on, always connected.
Phil Stanhope CTO, Adesso Systems
When I see "Bubble 2.0" I can't help but think of the real estate market! Anyway, I read your article - since you discuss such a wide range of topics I think it might have been better to split it up into multiple articles, but I found many of the things discussed interesting.
I agree Marketing/PR departments shouldn't decide how to build products. As a fellow Californian, I'm not sure your stereotype of us as utopians who think the Internet will solve all our problems applies.
No fear, there are many here in California who build real systems, and have very little time for Web 2.0 crapola.
And finally Adrian Burdess has a suggestion: Save The Utopians!
The starry-eyed, sun-stroked utopians are a lovely bunch of people. They are a unique representation of a breed of the human race seemingly psychologically incapable of any form of cynicism whatsoever, and like any rare curiosity they should be preserved for study in their natural environment.
If we don't act now we stand to lose these rare and beautiful creatures, and technology gives us options we never had in the 60s, when the last bunch died off in the pandemic that was the 70s.
I suggest the best place would be some form of "Nature Reserve" or "Wildlife Game Reserve" where we can study these unusual creatures remotely, from inside hides and from behind fences, so as best not to interfere with their rather cute, almost delicious society. Of course we would have to carefully restrict access to this "Reserve" from the outside, whilst ensuring that the Utopians were able to interrelate without obstruction, barrier, impediment or, most importantly, any lack of potential trust.
These "Trust-based Reserves" would help ensure that they continue to exist for many generations by "walling" them off from periodic outbreaks of cynicism and, worse, the more common and insidious disease: realism.
Awesome. Thank you. ®