Original URL: http://www.theregister.co.uk/2006/04/29/oreilly_amazon/
WS-* vs the REST
O’Reilly, Amazon talk web service standards
At Microsoft's Mix06 conference last month, we took the opportunity to speak (separately) to Web 2.0 guru Tim O'Reilly and Amazon's web services evangelist Jeff Barr on how they see the REST vs WS-* debate.
O'Reilly on the debate
There’s always a place for both. I don’t mean to suggest that all that other stuff (WS-*) is a waste. I do think that there’s almost always a premature effort to standardise. Donald Knuth, the famous computer science professor, said that premature optimisation is the root of all evil. He was talking about program performance, but I think that principle applies. When you get a standards committee together and they think about everything that might happen and they try to specify it, they inevitably get a lot wrong.
I think there are also some political aspects. Early in the web services discussion, I remember talking with Andrew Layman, one of the SOAP architects at Microsoft. He let slip that it was actually a Microsoft objective to make the standard sufficiently complex that only the tools would read and write this stuff, and not humans. So that was a strategy tax that was imposed by the big companies on some of this technology, where they made it more complicated than it needed to be so they could sell tools.
It’s not necessarily just Machiavellian scheming. I think Microsoft really believes that you can create better user experiences with tools that give people so much more power. But it is also a business strategy, whereas a lot of innovation comes when things are simple enough for people just to try them out and jam against them, if you like, to use a jazz term.
If you look at a lot of innovations in the computer industry, they come when something is simple and the barriers to experimentation are low, and not from big company standards-driven top-down initiatives. My point is that it is really too early to build some of the things that people have been trying to push out as web services, and right now simpler is better, because people are just figuring stuff out from the bottom up.
What of the idea that simple approaches like REST are fine for non-critical websites but won't work in the enterprise?
I think it is misguided. Doesn’t that remind you a bit of "PCs will never work in the enterprise, the personal computer is a toy?" I feel that disregarding the simple bottom-up stuff is a recipe for companies to fail. Simple bottom-up stuff is a driver of the future. You need to figure out how to layer more complexity on top of that and add value to it, rather than present this big heavyweight solution that is theoretically better.
The Amazon view
O’Reilly’s remarks are supported by Amazon’s experience. Amazon is something of a web service pioneer. In July 2002, the company introduced e-commerce services that enabled affiliates to access the Amazon database via an XML API. More recently it has delivered generic services for applications developers, covering message queuing, internet storage via a service called S3, and the innovative Mechanical Turk, a web service that links calls from applications into tasks carried out by humans. Amazon provides both SOAP and REST implementations for all its web services. What trends is it seeing?
Barr on the debate
So what are the implications of Amazon's experience for the people on the standards committees?
We talk to those people pretty regularly, and they're very concerned that maybe SOAP isn't getting as much attention as it might. But they need to go out and talk to their communities: not the folks who are simply imagining these protocols in their ivory towers, but real developers.
One of the expectations is that as the services become more complicated, as we need to deal with authentication, as we need to deal with signed requests and so forth, we would hope (but I've been telling developers this for just on four years now) that some of the complexity of SOAP will actually start to be of value. The jury's still out on whether that's really true or not.
Is compatibility between different SOAP and WSDL implementations an issue?
Yes, that's an issue. The various tools, tool chains, and tool sets are at different levels of maturity. We have to test against several different consumers of WSDLs before each release to make sure we don't put something out there that doesn't work with a particular tool. That said, compatibility is not a barrier to SOAP adoption: by the time developers see it, we've already taken care of any issues.
The great thing about REST is that you can just demo it in the browser. Bring up the browser, put in a URL, explain all the fields, hit Enter, and the XML comes back.
Given the number of different things that the web services community is asking of developers, I think that's really important, that really easy ramp, rather than that wall that you slam into as you're trying to come up to speed with WSDL and SOAP.
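Barr's browser demo amounts to nothing more than an HTTP GET against a query URL. A minimal sketch of building such a REST-style request, using a hypothetical endpoint and parameter names rather than Amazon's actual API:

```python
from urllib.parse import urlencode

# Hypothetical REST-style query. The endpoint and parameter names are
# illustrative only, not Amazon's real API.
base = "https://webservices.example.com/onca/xml"
params = {
    "Operation": "ItemSearch",
    "SearchIndex": "Books",
    "Keywords": "web services",
}
url = f"{base}?{urlencode(params)}"
print(url)
# Paste the resulting URL into a browser's address bar and the XML
# response comes straight back -- no WSDL, no toolkit, no code generation.
```

The entire "easy ramp" Barr describes is visible here: every field of the request is human-readable in the URL itself.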
Undoubtedly, we could have spoken to folk at IBM, Microsoft or BEA and gotten a more positive account of WS-*, and as Barr stated, there is always the possibility that it may come into its own in complex, business-critical enterprise environments. Even so, Amazon's experience underlines the importance of accessible APIs and echoes O'Reilly's comments about bottom-up standards.
The evolution of the programmable web continues apace, but there’s intense debate over what the internet’s API should look like. Five years ago it looked as if SOAP was the answer, a W3C standard for exchanging data with XML and heavily sponsored by Microsoft and IBM. Support for SOAP and WSDL (Web Services Description Language) is built into Microsoft's .NET Framework. In the years since SOAP was first specified, its sponsors have been working on further web service specifications that build on SOAP to address issues such as encryption, authentication, guaranteed delivery, event handling, workflow, and transaction support, creating a large suite of protocols often summarised as WS-*. These are important issues, but WS-* has been widely criticised as over-complex. "Is the emperor dressed?" asked Sun's Tim Bray in 2004.
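The extra machinery the critics complain about is easy to see: a SOAP call wraps even a simple query in an XML envelope that must be POSTed with the right headers, which is why it is usually generated by tooling rather than written by hand. A minimal sketch of such an envelope, using a hypothetical service namespace and operation name:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace and operation -- illustrative only.
SVC_NS = "http://webservices.example.com/schema"

# Build the standard SOAP 1.1 Envelope/Body structure around the call.
env = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
op = ET.SubElement(body, f"{{{SVC_NS}}}ItemSearch")
ET.SubElement(op, f"{{{SVC_NS}}}SearchIndex").text = "Books"
ET.SubElement(op, f"{{{SVC_NS}}}Keywords").text = "web services"

payload = ET.tostring(env, encoding="unicode")
print(payload)
# Unlike a bare query URL, this payload must be POSTed (typically with a
# SOAPAction header) -- it cannot be typed into a browser's address bar.
```

Nothing here is beyond a developer's reach, but the contrast with a plain GET illustrates why critics see the SOAP stack as a wall rather than a ramp.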
This is partly a political matter, but another notable fact is that the plethora of new APIs associated with Web 2.0, such as Google Maps, have not generally used SOAP. Joshua Schachter, founder of the del.icio.us web tagging site, recently stated that "SOAP is insane".
Some advocate an alternative approach called REST (Representational State Transfer), in which every resource is addressed by its URI and operations are based on HTTP's GET and POST. It's also possible to use REST and SOAP together. ®