Events, dear boy... a view from Apache

Who better to comment on open source quality issues?

Comment After last month's somewhat lightweight piece, I had a solid programmer article in mind for March. But, like Harold Macmillan, I'm not going to ignore events. So here instead is something topical.

Open source advocates have been getting very worked up about an article in The Economist entitled "Open source business: Open, but not as usual". The journalist is, as usual, wrong, misinformed, spreading FUD. Sigh. Dear Editor, ...

Do the slashdotters and groklawyers have a point? Yes, certainly. But do they make it? Not really: it's quite hard to find the valid criticisms amid the noise. And who do they speak for? A proportion (probably rather small) of the open source community (whatever that is). And, not least, am I misrepresenting them? Guilty: of course I'm generalising!

Much of the criticism focuses on some "conventional wisdom" about open source: it's chaotic, uncontrolled, unaccountable. Another point that attracted fury was the apparent credibility given to the SCO lawsuits (why do articles like this never mention, for example, Timeline's successful multi-million-dollar lawsuits against Windows end-users?). But these criticisms are, in my view, misplaced: the article goes on to explain why this conventional wisdom isn't true of significant open source projects such as Apache. Some genuine - and indeed fundamental - factual errors, such as equating open source with the GPL, have largely been missed, and are in any case peripheral to the article. This is certainly an advance on the days when an article in the mainstream press could be expected to confuse Free Software with the crap on a typical magazine cover disc.

In my view, a stronger criticism is the characterisation of Wikipedia's troubles as a failure of open source. It cannot be that, because Wikipedia is not an open source project in the first place (open content, perhaps?). Alas, it's hard to give the article the benefit of the doubt there. But then again, the article's very thesis is that concepts originating with open source software are being adopted in other fields of human activity. Wikipedia is indeed an example of the perils of adopting, without adequate thought, something that looks beguilingly like open source and Eric Raymond's "bazaar".

The Economist is usually well-informed, but at the level of the boardroom executive, not the backroom geek. The most interesting thing about the article is that it appeared at all: this is far more testament to the success of open source than any sound and fury from Microsoft and SCO, or from slashdot and groklaw.

So, how does Apache work in reality to ensure quality and accountability?

Well, any technology company will tell you its greatest asset is its people. Apache is no different, though the brand is worth something too! It has a major advantage over commercial companies: it's not scrabbling for vaguely competent developers in a competitive marketplace. It's under no pressure to recruit, and it can afford to restrict itself to people with a proven track record of contributions. And its pool of prospective developers is worldwide, limited only when a developer - or his/her employer - is unwilling or unable to sign the relevant Contributor License Agreement (CLA), the document that guarantees Apache and its users the legal right to use the intellectual property in the developer's contributions.

So, Apache has the talent. How does it make best use of it?

Formally, the project is managed by the Project Management Committee (PMC), which reports to the ASF board. But although the PMC has a private mailing list and ultimate responsibility, all technical discussion happens openly and in public, with the possible exception of security issues. The primary resources are the Subversion (formerly CVS) repository and the developer mailing list. Other important forums include the Bugzilla database, IRC chat, and the ApacheCon conference - the latter being the only time significant numbers of developers are likely to meet face-to-face.

The Subversion repository is Apache's codebase, and encompasses all ASF projects - not just the webserver and its associated APR library (which I, rather loosely, describe as "Apache"). The codebase comprises several parallel versions:

  • Trunk is the current development branch, containing all the latest developments. It is designated Commit-Then-Review (CTR): changes are committed first and reviewed afterwards, so new ideas can be tested without bureaucracy and hassle.
  • Release branches (2.2, 2.0, 1.3) are designated stable, and are subject to much stricter, Review-Then-Commit rules. Any change to a stable branch must be reviewed and approved by at least three developers, and both source and binary backwards compatibility are mandatory. This is coordinated through a file called STATUS in each branch, documenting proposed changes and developer comments. Normally, patches to stable branches are tried in trunk first and then backported.
  • Experimental branches may be created ad hoc, to develop new ideas likely to cause significant disruption. A recent example is fully asynchronous client I/O. A successful experiment will be merged into trunk once it has stabilised.
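As an illustration of how a STATUS file coordinates a backport, here is a mocked-up entry: the change, revision number, and developer names are invented, and the layout is only an approximation of the real files' conventions, not a verbatim excerpt:

```text
PATCHES PROPOSED TO BACKPORT FROM TRUNK:

  * core: fix a (hypothetical) crash in per-directory config merging
      trunk patch: r123456
      2.2.x patch: trunk patch applies cleanly
      +1: developer-a, developer-b
      -1: developer-c (veto: changes a public structure, breaking
          binary compatibility - see the thread on the dev list)
```

A change graduates to the stable branch only once it has collected the required three +1 reviews and no outstanding objection; the -1 above would first have to be resolved in discussion on the mailing list.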

A developer may also veto a proposed change. However, a veto must be accompanied by a detailed explanation and/or an alternative proposal, which then stimulates a discussion on the mailing list to resolve the issues. This imposes checks and balances a good deal more effectively than traditional management does.

Apache's Bugzilla database is also fully public and unmoderated, and can be chaotic. It mixes real bugs and enhancement and feature requests with non-bugs entered by users, a few of whom refuse to let their pet 'bug' be closed. Some bugs may also be left permanently open to document features and provide patches for users with specific needs that would not be appropriate in the standard Apache.

Intellectual property is of course highly topical. Apache has robust measures in place to protect itself and its users from incorporating encumbered third-party IP: it holds written consent from both individual and corporate contributors, and rejects contributions where adequate guarantees are not available.

In summary, Apache's success is backed by strong organisation and effective processes. It is indeed disorganised in the sense that individual developers have much more freedom than in a commercial environment. That's nothing but good: a free developer is a motivated and productive developer, and peer review provides ample checks and balances - even leaving time for the tedious bits such as bugfixes! The bottom line: three times the market share of its nearest competitor. Next time you hear the argument that a market leader is bound to be the product that suffers security exploits, just cite Apache's track record!
