OAuth 2.0 standard editor quits, takes name off spec

Says the protocol is enterprise-grade rubbish

The lead author and editor of the OAuth 2.0 network authorization standard has stepped down from his role, withdrawn his name from the specification, and quit the working group, describing the current version of the spec as "the biggest professional disappointment of my career."

Eran Hammer, who helped create the OAuth 1.0 spec, has been editing the evolving 2.0 spec for the last three years. He resigned from his role in June but only went public with his reasons in a blog post on Thursday.

"At the end, I reached the conclusion that OAuth 2.0 is a bad protocol," Hammer writes. "WS-* bad. It is bad enough that I no longer want to be associated with it."

OAuth is an authorization protocol that allows users to share private resources stored on one site with applications running on another site, without handing out their usernames and passwords.
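As a rough sketch of the kind of delegation OAuth 2.0's "authorization code" flow enables — the endpoint URLs, client ID, and redirect URI below are purely illustrative, not taken from any real provider:

```python
from urllib.parse import urlencode

# Hypothetical endpoints and client credentials, for illustration only.
AUTHORIZE_URL = "https://provider.example/oauth/authorize"
TOKEN_URL = "https://provider.example/oauth/token"
CLIENT_ID = "my-app"
REDIRECT_URI = "https://client.example/callback"

def build_authorization_url(scope, state):
    """Step 1: the client sends the user to the provider's site to
    approve access. The user authenticates with the provider, so the
    client application never handles their username or password."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": scope,
        "state": state,  # anti-CSRF value; the client checks it on callback
    }
    return AUTHORIZE_URL + "?" + urlencode(params)

def token_request_body(code, client_secret):
    """Step 2: the provider redirects back with a short-lived code,
    which the client exchanges (along with its own secret) for an
    access token via a back-channel POST to TOKEN_URL."""
    return urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": client_secret,
    })
```

The access token the client receives can then be presented to the resource server in place of the user's credentials.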

Its best-known proponent is Facebook, which has implemented a draft version of the OAuth 2.0 spec as part of its Open Graph set of social APIs. Other high-profile sites that have implemented OAuth to some degree include Google, Microsoft, Twitter, and Yahoo!

But according to Hammer, none of these implementations is likely to be interoperable with any of the others, because the OAuth 2.0 specification has grown too broad and allows almost unlimited extensibility.

"It is this extensibility and required flexibility that destroyed the protocol," Hammer writes. "With very little effort, pretty much anything can be called OAuth 2.0 compliant."

The problem, in Hammer's view, is that the OAuth 2.0 working group has catered far too much to the needs of the enterprise world, at the expense of important security features that are necessary if the protocol is to be used on the web.

Authorization tokens in OAuth 2.0 are inherently less secure than they were in OAuth 1.0, he says, as a direct result of a series of compromises that were made to address the demands of the enterprise community.
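The contrast Hammer is drawing can be sketched in code. OAuth 1.0 requires every request to carry an HMAC signature computed over the request itself, keyed by secrets that never travel over the wire; OAuth 2.0's common "bearer" tokens are self-contained credentials whose safety rests entirely on TLS and careful handling. The signature construction below is a simplified rendering of the OAuth 1.0 scheme, not a complete implementation:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote, urlencode

def oauth1_signature(method, url, params, consumer_secret, token_secret):
    """OAuth 1.0 style: sign the request with HMAC-SHA1. The token is
    useless to an eavesdropper without the secrets, and a captured
    request cannot be replayed against a different URL or parameters.
    (Simplified: a real implementation also normalizes the URL and
    includes oauth_nonce, oauth_timestamp, etc. in the parameters.)"""
    base = "&".join(quote(part, safe="") for part in (
        method.upper(),
        url,
        urlencode(sorted(params.items())),
    ))
    key = quote(consumer_secret, safe="") + "&" + quote(token_secret, safe="")
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

def oauth2_bearer_header(access_token):
    """OAuth 2.0 bearer style: the token itself is the entire
    credential. Anyone who obtains it can use it, which is the
    cryptographic property Hammer objects to."""
    return {"Authorization": "Bearer " + access_token}
```

The working group's answer was to push signature-like protections into optional extensions, which is exactly the extensibility Hammer argues undermines interoperability.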

Even worse, Hammer says, the working group has been unable to reach a consensus on a long line of significant issues, resulting in a specification that fails to deliver on even its most basic goals and doesn't achieve anything more than OAuth 1.0 did.

"I honestly don't know what use cases OAuth 2.0 is trying to solve any more," Hammer says.

Hammer believes the eventual breakdown of the OAuth specification effort was the direct result of its becoming a working group under the Internet Engineering Task Force (IETF) in 2009, which he now feels was "a huge mistake." The IETF, he believes, is institutionally incapable of producing a simple protocol that serves the needs of the web community, like OAuth 1.0.

Following Hammer's post, the broader OAuth community chimed in to agree with many of his points.

"I can't decide if I should feel guilty for dropping out immediately after IETF San Francisco, or if I should feel grateful I didn't waste any time on the OAuth 2.0 fight," writes Mark Atwood in a comment on Hammer's original post.

Others disagreed with Hammer's assertion that OAuth 2.0 was a failure, and said that the problems with the standardization process were more organizational.

"I've built client libraries for both OAuth 1.0 and 2.0 and I can tell you hands down that OAuth 2.0 is much easier to implement than OAuth 1.0," writes Joe Gregorio in a post on Google+. He adds, "The IETF process isn't really broken, but it really only works with good working group chairs in place."

What Hammer's departure will mean for the OAuth 2.0 standard remains to be seen. But Hammer himself is not optimistic.

"I think the OAuth brand is in decline," he writes. "This framework will live for a while, and given the lack of alternatives, it will gain widespread adoption. But we are also likely to see major security failures in the next couple of years and the slow but steady devaluation of the brand. It will be another hated protocol you are stuck with." ®
