UK PM Theresa May orders review of online abuse laws in suffrage centenary speech
And social media giants will be the ones to clean it up
Posted in Policy, 6th February 2018 11:17 GMT
UK Prime Minister Theresa May has ordered a review of British laws governing online communications in her latest shot at big tech firms.
In a speech today marking the centenary of at least some women gaining the right to vote in Blighty, May will say that public debate is "coarsening" with a lot of this troubling behaviour taking place online.
"As well as being places for empowering self-expression, online platforms can become places of intimidation and abuse," she will say.
"This squanders the opportunity new technology affords us to drive up political engagement."
May will argue that such abuse is disproportionately targeted at political candidates who are female, black, minority ethnic or LGBT, which damages equal representation in politics.
Placing the task of addressing such abuse firmly at the door of firms like Facebook and Google, May will say that social media companies "must now step up".
She will endorse recommendations made by the Committee on Standards in Public Life late last year, which called for laws to shift liability for illegal content towards social media firms, and another chorus of the old favourite that they need to speed up content takedowns.
May has also tasked the Law Commission with reviewing legislation on offensive online communications to ensure it "is appropriate to meet the challenges posed by this new technology".
Number 10 said this will include asking "whether particular concepts need to be reconsidered in the light of technological change, for example whether the definition of who a 'sender' is needs to be updated".
To make sure firms are doing what she wants, May announced a set of measures to single out those not falling in line.
There will be another code of practice for biz to sign up to, which will define the "minimum expectations" of how social media firms should behave. It will include how to develop, enforce and review community guidelines and detail reporting mechanisms for abusive content online.
There will also be an annual transparency report to mark their homework – this should set out firms' policies and plans to deal with harmful content, along with data on how many complaints they get, how they are dealt with and how much content is taken down as a result.
These ideas aren't particularly new: proposals for new laws on online hate content have been widely discussed – and implemented – around the world. The European Union already has a code of conduct and has carried out an annual review of action against illegal online content since 2016.
But the PM has no doubt spotted the apparent success of Germany's NetzDG law – under which firms can be slapped with a fine of up to €50m for failing to whip down content – which resulted in Facebook setting up a 500-strong office of people to monitor complaints.
Strong statements suit the government, which needs to be seen by the voting public to be doing something (anything). The firms, meanwhile, continue to fall over themselves to be seen in a similar light – especially with the legislative stick in the background.
However, campaigners remain concerned about the chilling effect heavy-handed implementation might have on free speech, as companies may be encouraged to err on the side of removing any reported content to ensure they don't fall foul of new laws. ®