UK Prime Minister calls on internet big beasts to 'auto-takedown' terror pages within 2 HOURS

And you thought 24 hours would be tough…

The UK's Prime Minister has once again raised the tech stakes in the fight against online terror, with her latest, er, bright idea being for internet giants to stop extremist content before it's even online.

At a meeting with companies including Facebook, Microsoft, Twitter and Google today, Theresa May urged them to "develop new technological solutions to prevent such content being uploaded in the first place".

Material from terrorist groups like ISIS is available on the internet for "too long" after being posted, May said at the meeting in New York, which coincided with the UN's General Assembly.

The German government recently passed a law that would see major social media firms fined up to €50m if they do not pull down or block criminal content within 24 hours.

But now the UK government – along with its French and Italian counterparts – is pushing for content to be removed within one to two hours of it going online. The governments are understood to be considering how to impose fines on companies that don't meet the targets.

"Industry needs to go further and faster in automating the detection and removal of terrorist content online, and developing technological solutions which prevent it being uploaded in the first place," May said.

She added that tech-savvy terrorist groups will continue to up the ante, and so warned the firms that they need to constantly adapt if they are to stay one step ahead.

However, the technical ask here is huge, as has been pointed out pretty much every time governments make sweeping demands of the tech industry.
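None of the companies has spelled out what pre-upload blocking would actually look like, but the usual starting point is matching new uploads against a shared database of hashes of material that has already been identified and taken down. The sketch below is purely illustrative – the function, the blocklist and the hash in it are assumptions, not any firm's real system – and it also shows the obvious limitation: an exact-match filter only catches re-uploads of content somebody has already flagged.

import hashlib

# Illustrative blocklist: SHA-256 hashes of files already identified as terrorist
# material. In practice this would be a large, shared, regularly updated database;
# the value below is a made-up placeholder, not real content.
KNOWN_BAD_HASHES = {
    "d2c1f0a3" + "0" * 56,  # placeholder hash
}

def should_block_upload(file_bytes: bytes, blocklist: set = KNOWN_BAD_HASHES) -> bool:
    """Return True if an uploaded file is byte-for-byte identical to known bad content."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in blocklist

# Usage: check a file at upload time, before it is ever published.
upload = b"example video bytes"
if should_block_upload(upload):
    print("Upload rejected: matches known extremist material")
else:
    print("Upload allowed: no match against the hash database")

Even a trivial edit to a video changes its cryptographic hash, which is why platforms also lean on perceptual hashing and machine-learning classifiers – and those fuzzier techniques are exactly where, as the critics below point out, the false positives and misses creep in.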

As Brian Lord, former deputy director of intelligence at GCHQ, put it on Radio 4's Today programme this morning, it is easy to say "just get the technology to do it" – but there's a lot more to it in practice.

"You can use a sledgehammer to crack a nut and... just take a whole swathe of information off the internet because somewhere there will be the bad stuff we don't want people to see," he said.

But, Lord added, this will include a lot of valuable information that society, and governments, rely on, and so more refined techniques will be needed "and this is more challenging".

Lord also noted that, like the war on drugs, total removal is "never going to happen" – instead companies and governments can only make it more difficult for nefarious types to get access to, post and share terrorist content.

Jim Killock, executive director of the Open Rights Group, also urged caution over the plans for automated takedowns.

"Internet companies have a role to play in removing illegal content from their platforms but we need to recognise the limitations of relying on automated takedowns," Killock said.

"Mistakes will inevitably be made – by removing the wrong content and by missing extremist material.

"Given the global reach of these companies, automated takedowns will have a wide-reaching effect on the content we see, although not necessarily on the spread of extremist ideas as terrorists will switch to using other platforms."

He added that the move could have "wider implications", noting that the plans could also be used to "justify the actions of authoritarian regimes, such as China, Saudi Arabia and Iran, who want companies to remove content that they find disagreeable".

Nonetheless, internet companies need to show that they are listening to governments' concerns – especially given that May's comments come not even a week after a homemade bucket bomb was planted on a London Tube train.

Indeed, Google used the opportunity to announce it is launching an innovation fund to address hate and extremism, backed with a pocket-change $5m.

A blogpost from senior veep for policy Kent Walker said that, over the next two years, Google will invest the cash in "technology-driven solutions, as well as grassroots efforts like community youth projects that help build communities and promote resistance to radicalisation".

The first grant, he said, will be for $1.3m and be distributed by the UK's Institute for Strategic Dialogue.

More information on how to apply will be released in the coming months, with projects expected to come up with "innovative, effective and data-driven solutions that can undermine and overcome radicalisation propaganda". ®