Euro Commission gives tech firms an hour to take down terror content

Three months to get your houses in order... or this time we really might legislate

The European Commission has given tech firms three months to set up systems that will allow them to take down terrorist content within an hour.

The Commission's missive is a recommendation that builds on a communication issued last September – although a recommendation is technically a legal text in EU-ese, it isn't legally binding on the companies.

Rather, it sets out five operational measures for dealing with illegal content and five rules specifically for terrorist content for companies to follow. They will be assessed on this and, as ever, the Commission has the legislative stick close to hand.

Broadly, the rules push for increased use of automation – including controversial upload filters that screen content before it is published – along with greater clarity on notification systems, more cooperation with law enforcement, and a call for big biz to help out smaller firms.

For terrorist content, the Commission made it clear that speed should be a priority, saying firms should have taken action "within one hour from its referral, as a general rule".

However, it has stopped short of introducing legislation – unlike the German government's much-derided NetzDG law, which threatens firms with €50m fines for a failure to take down in 24 hours.

The introduction of fines and short time limits has led to widespread concern among businesses, activists and lawyers about the impact on freedom of expression.

At a press conference, vice president for the digital single market Andrus Ansip said that the German law was "not ideal". He added that with the threat of a fine, "of course the reaction of platforms is very simple. If they have some doubts... down, out, and so on", but contrasted this with the Commission's call for better safeguards.

"We proposed safeguards to protect freedom of speech, of expression. I think our recommendation is pretty balanced," he said.

These include human oversight, with a review step before content is removed, to ensure removal decisions are "accurate and well-founded" – especially when decisions are automated. It also calls for firms to give content posters the chance to contest such decisions.

Nonetheless, the Commission has once again left the door open for future legislation, saying it will assess the actions taken off the back of the recommendation and decide whether legislation is needed after all.

Companies and member states are given a deadline of three months for dealing with terror content and six for other illegal content.

The Commission has also made it clear that small firms will be covered by the recommendation, noting that they are increasingly likely to "become a soft target for illegal content online".

Commissioner Julian King said the body wants "the larger platforms to help the smaller platforms to avoid a migration of the content" to the smaller sites.

The British government's latest effort to show it's doing something to tackle terrorist content online – the launch of an AI tool to detect Daesh content – was also focused on smaller firms.

Such automated and proactive tools are a central part of the Commission's proposals – although it's widely acknowledged in the tech industry that creating accurate tools will be tough, especially as terrorist organisations are likely to adapt their content to get around them.

The recommendation also calls for more transparent and simpler rules for how people can notify firms of illegal or terrorist content, including fast-tracking alerts for trusted flaggers.

At the press conference, though, justice commissioner Vera Jourova tried to emphasise that the body wasn't simply targeting the tech firms.

"It is not only a duty of IT companies to make the internet a safe place," she said. "Today's recommendation is also very clear on when the companies should notify prosectors... in cases of serious crime."

"And I expect the prosecutors to act," she said, adding that if they didn’t, they would be "complicit" in creating an atmosphere that encouraged and allowed hate speech to proliferate. ®
