The tragedy of the Creative Commons
Bogus tags sour the cybernetic dream
The Creative Commons initiative fulfilled a major ambition last week - but it's taken only days for the dream to turn to crap.
Google granted the wish by integrating the ability to filter image searches by rights licence into Google Image Search. Yahoo! has offered a separate Creative Commons image search for years, but Google has built the filter into its main index.
Making the licences machine-readable was a long-standing goal of the project, and was lauded as a clever one. It promised to automate the business of negotiating permissions for using material, so that machine would negotiate with machine in a kind of cybernetic utopia. Alas, it hasn't quite worked out.
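The machine-readable part rests on a simple convention: Creative Commons asks publishers to mark the licence in the page itself with `rel="license"` markup, which crawlers then harvest. A minimal sketch of that harvesting step, using only Python's standard library (the sample page here is invented for illustration):

```python
from html.parser import HTMLParser

class LicenseFinder(HTMLParser):
    """Collects href values from elements marked rel="license"."""
    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # rel can hold several space-separated tokens, e.g. rel="license nofollow"
        if "license" in a.get("rel", "").split() and "href" in a:
            self.licenses.append(a["href"])

# A hypothetical photo page asserting a CC licence in its markup
page = '<p>Photo: <a rel="license" href="https://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></p>'
finder = LicenseFinder()
finder.feed(page)
print(finder.licenses)
# → ['https://creativecommons.org/licenses/by-nc/4.0/']
```

Note what the sketch makes plain: the crawler can only record what the page asserts. Nothing in the markup proves the uploader actually held the rights, which is exactly where the scheme breaks down.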
As Daryl Lang at professional photography website PDN writes, the search engine is now choked with copyrighted images that have been incorrectly labelled with Creative Commons licences. These include world-famous images by photographers such as Bert Stern and Steve McCurry. As a result, the search feature is all but useless.
Since there's no guarantee that the licence really permits the use it claims, a publisher (amateur or professional) must still perform the same due diligence as before. So it's safer (and quicker) not to use the feature at all.
What's gone wrong, as Lang explains, is the old engineering principle of GIGO, or Garbage In, Garbage Out. "The system relies on Internet users to properly identify the status of the images they publish," he writes. "Unfortunately, many don't... Many Flickr users still don't understand the concept of a Creative Commons licence, or don't care.
"It's time consuming to put a different label on every image [in their collection], and there are no checks in place [our emphasis] to hold users accountable for unauthorized copying or incorrect licensing labels."
So Google won't take responsibility for the accuracy of the licensing metadata, and Creative Commons, as a small private internet quango, says it can't afford to. (The disclaimer on the website is simple: go find yourself a lawyer.)
Just as we predicted, in fact: the filtering is less than perfect, and it's lip-service to creators.
Now, why did it have to fail?