Porn, abuse, depravity - and how they plan to stop it
Part one: Strangling content
Policing the Internet

Contrary to popular belief, the government and police forces have hitherto not exerted a great deal of direct control over content. But, after a decade of growth in self-regulation and filtering by the industry to avoid government intervention, that may be about to change.
Current UK law on content is a mish-mash. The first and main stop-off must be the Obscene Publications Act 1959 (pdf), which made it an offence to publish material likely to "deprave and corrupt". This has been followed over the years by various laws brought in to deal with specific media and moral panics.
In the late 1950s, for example, a law was passed to regulate children's cartoons - and was then used just once in the next 50 years. And in 1984, panic over "video nasties" led to the Video Recordings Act (pdf), which has been used rather more frequently.
Child protection has been a favourite wedge for politicians, and this has introduced a new principle to UK law over the last couple of decades, making possession of specific images an offence. Recently, this principle has been extended from child abuse into the areas of porn and terror.
We are not immune...
The first sign that the net wasn't entirely above the law came in 2000 with settlement of the Demon/Godfrey case. Dr Laurence Godfrey had sued Demon for hosting a libel in the form of a Usenet message, and Demon settled out of court for £15,000 plus £250,000 expenses. This was a wake-up call to ISPs, who suddenly grasped that - in the UK at least - they might bear some responsibility for the content they hosted.
The late 90s also saw the emergence of the one body most of us associate with internet policing - the Internet Watch Foundation (IWF).
As public concern about the availability of paedophilic material on the net grew, one of the UK's more influential net figures, Peter Dawe (then CEO at Pipex), realised that unless ISPs took action they would be forced to do so by government. He was therefore a key figure in the creation of the IWF, which was founded in 1996 by the ISPA, the trade body for internet service providers, and other leading players in the internet industry.
The IWF model probably comes closest to what we might think of as an internet policeman. It runs a hotline through which the public and IT professionals may report any inadvertent exposure to potentially illegal content online.
The IWF then investigates, and its staff liaise closely with the police in order to align their standards with what the law says. The organisation then has two possible avenues of response. If a site deemed to be hosting illegal content is found to be operating within the UK, the IWF issues a take-down notice, which usually results in that site coming down within 24 hours. For content hosted outside the UK, the IWF sends details to the relevant authorities for investigation.
But it also maintains a list of potentially illegal URLs, updated twice daily, and this is provided to all major UK ISPs. At any one time, the list contains between 800 and 1,200 live child abuse URLs, with around 50 added daily.
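The mechanics of that blocklist are confidential, but the principle - match each requested URL against a regularly updated list and refuse matches - is simple enough to sketch. The following is a minimal illustration only, with hypothetical domains standing in for the real list; it is not how any UK ISP's filter (such as BT's Cleanfeed) is actually implemented:

```python
from urllib.parse import urlsplit

# Hypothetical entries standing in for the confidential IWF list,
# stored as (hostname, path) pairs.
BLOCKLIST = {
    ("bad.example.com", "/images/page1"),
    ("bad.example.net", "/"),
}

def normalise(url: str):
    """Reduce a URL to a comparable (hostname, path) form."""
    parts = urlsplit(url.strip())
    # urlsplit lowercases the hostname for us; default path to "/".
    return (parts.hostname or "", parts.path or "/")

def is_blocked(url: str) -> bool:
    """True if the requested URL matches an entry on the blocklist."""
    return normalise(url) in BLOCKLIST
```

In practice a filter of this kind sits in the ISP's network path and must also cope with encodings, redirects and deliberate evasion - which is part of why list-based blocking is harder than it looks.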
In theory, owners of websites blocked or taken down by the IWF have a right of appeal. But according to the IWF this circumstance has never arisen.
The IWF's remit is slightly peculiar. Its focus - and what constitutes the vast majority of its work - is child abuse hosted anywhere in the world. But it also polices UK-hosted sites whose content is deemed to be criminally obscene or to incite racial hatred (and only racial hatred - no other form of hatred is covered).
Is there any question mark over where the IWF might next extend its empire? Talking to The Register, the IWF suggested that the clampdown on extreme porn was a refinement of its criminal obscenity remit, and that it was engaged in consulting with its board and the online industry regarding its potential future role.
But this takes the organisation into greyer territory. Its work on child abuse is clearly defined, almost beyond controversy, and has gained the IWF a reputation as an agency for good. It is censorship, certainly, but it's not the sort of censorship that attracts much criticism.
Extreme porn might be a harder pill to swallow. Definitions are less clear, and the existence of several organised campaigns on this subject suggests it may be a much hotter political potato.
The ad hoc and self-regulatory nature of internet policing may be seen in the miscellany of other organisations that in one way or another are regulating content. The IWF is a self-regulatory body with charitable status working closely with government, and government agencies such as the police. But it is not of the government.
Then there is the Advertising Standards Authority (ASA), a non-statutory alliance of agencies, media owners and advertisers which keeps an eye on advertising content both on- and off-line. Concern about online claims that go beyond simple advertising has led to the formation of the Digital Media Group, which will be concerned with misleading claims and pricing.
On the content side, the Press Complaints Commission regulates press content through its code of practice, and already covers newspaper websites in its remit. It recently added audio-visual content, and there are suggestions that it would like to extend its brief to web-only news sites.
Films are regulated via the British Board of Film Classification (BBFC), which earlier this year extended its Black Card classification system to online films. That may in time prove critical in the light of new legislation on extreme porn: a controversial film with a black card attached will be safe to own. Identical material without the black card may attract a criminal prosecution.
Then there is Ofcom, a statutory organisation charged with upholding the broadcasting code, and dealing with complaints about possible breaches. Rumours abound that it might like a more formal internet remit - government has frequently mooted such a role. But the official line appears to be 'no thanks'.
The one organisation that does not appear to play much of a role in formal policing of net content is, strange but true, the police. But police will respond to specific complaints, and in the last few years, two notorious cases have demonstrated how the police can and will put pressure on websites for reasons of public order.
First was thinkofthechildren.co.uk, a spoof website that reacted to perceived paranoia over paedophiles by outlining the etiquette of forming a lynch mob. This was taken down, initially, after the Met had unofficial words with the host and suggested that the site "might" be construed as inciting violence. The site owner also claimed that the IWF had been complicit in this action - a claim that the IWF rejects.
Embarrassingly for the police, the site went straight back up, with a tart rejoinder that if the police thought any offence was being committed, they should deal with it through the proper channels. It still exists, mirrored here.
A similar incident arose in 2002 over the "Parking Clowns" website, which was the response of some residents in Canterbury to what they considered over-officious parking enforcement. The website included photos of parking attendants and this, the police advised, together with the generally inflammatory nature of the site, could be deemed to be harassment and might, again, possibly lead to violence.
"Parking Clowns" is no more. But these incidents seem few and far between - and despite assiduous attempts to elicit details of the police policing content, the Reg was unable to find much.
So is internet content largely safe? Have we reached the point where we can sit back and assume that what we have is what we get to keep?
Back in January, Home Secretary Jacqui Smith appeared keen to hitch the War on Terror to the IWF bandwagon by claiming that there were specific examples of websites that "clearly fall under the category of glorifying terrorism", and talking about how terrorists "groomed" potential recruits.
A significant development in this war on terror material is that the Home Office has since declared itself to be working with partners in an effort to "take down" terror material. The problem is that, in law, there was no such thing: individuals could be prosecuted for assembling material with the intention of using it to support terrorism, but the material itself remained, in theory, neutral. Not any more.
Similarly, in September the Ministry of Justice announced a forthcoming review of the law on suicide, observing: "UK ISPs already take down any websites under their control when notified that they contain illegal material and are free to restrict access to harmful or tasteless material in accordance with their 'acceptable use' policies."
For YouTube and Google, "standards" are the means by which the once unbridled freedom of the internet is being corralled. In Germany, for instance, Google is being pushed to block all foreign porn websites that don't conform to German-mandated age verification schemes. Expect more of the same to come.
Meanwhile, YouTube has bowed to US government pressure, updating community guidelines to encourage users to steer clear of posting material that might be considered to incite violence.
The scariest developments are yet to come. Forget the Scottish Parliament calling for an end to all violent images of women, or Home Office Minister Vernon Coaker pushing for all UK ISPs to run a blocking system akin to that created for the IWF.
The child protection lobby again appears to be acting as a Trojan Horse for far greater censorship. The Byron Review (pdf) reported to general government approval earlier this year, and one of its first fruits, to be launched this autumn, will be the UK Council for Child Internet Safety (UKCCIS). Threaded through its role of making the internet "safer for children" will be a remit to bring forward new regulation, or suggest legislation where appropriate, to control online content.
Another serious policy extension may be found within the first releases from the UKCCIS. While Byron spoke about working to prevent children accessing content that was "inappropriate to children", the UKCCIS has very quickly pushed that out to talk of blocking "inappropriate content" in general. The first aim is about protecting children: the second is very much about dumbing down - or infantilising - the internet for everyone.
Hard on the heels of Byron came the 10th report of the Select Committee on Culture, Media and Sport, looking at "Harmful Content on the Internet". It too believes that there is just too much nasty content out there - although at time of writing, it is in two minds as to whether the right way forward is more law or greater industry self-regulation.
A government response to its recommendations is due to be published when parliament returns in October. Anyone looking for a return to the good old days of internet free-for-all would be well advised not to hold their breath. They are gone - and aren't coming back. ®