Tech giants should take the rap for enabling fake news, boffins tell EU

Yes, yeesss, give us lots of new things to do, say academics


Giant US tech platforms are spreading misinformation and deliberately hiding their algorithms to evade scrutiny, according to a report for the European Commission.

The expert working group on fake news and disinformation headed by law professor Dr Madeleine de Cock Buning warned there's no quick fix to the problem – and avoided quantifying how big the problem is. She noted the phrase "fake news" has been "appropriated and used misleadingly by powerful actors to dismiss coverage that is simply found disagreeable" – whoever that may be.

The commission cited a pan-EU survey from last month, in which 83 per cent of respondents thought misinformation was a threat to democracy – although on closer inspection the poll revealed enormous differences in perception across the 28 member states (more below).

Having warned against censorship and internet fragmentation, the professor made recommendations for areas of further probing.

These include that the systems that circulate news should be more transparent; the professional European news industry should be diverse and sustainable; there should be more media literacy and more research; and finally there should be tools "for empowering users and journalists to tackle disinformation and foster a positive engagement with fast-evolving information technologies" – such as labelling and curation tools for news feeds and search results.

Platforms "have also enabled the production and circulation of disinformation, on a larger scale than previously, often in new ways that are still poorly mapped and understood". They should make retracting stories easier and work on "developing new online tools allowing users to exercise their right to reply and to correct false stories" – heralding a new battleground in online flame wars.

Graphic: Fake News Working Group. Source: European Commission

Media literacy and research are two favourites of EU working groups, as they pave the way for more work. In 2013, before the phrase "fake news" had ever been uttered by a sitting US President, a similar call for greater media literacy went out. That working group called for EU-wide state regulation of the media, with "independent media councils" able to impose fines or "even remove journalistic status", the licence to print news. None of those calls are repeated in this report.

The working group recommended better labelling of sources and whether content is paid-for, and "information on payments to human influencers and use of robots" – obliging Macedonian troll farms to self-identify.

Media literacy "must be implanted on a massive scale in school curricula and in teacher training curricula, with clear methods of evaluation and cross-country comparison [why? – ed] and with reflection in educational rankings gauges". A pan-EU community of practice should be established.

As for revealing the algorithms, the working group floated the idea of open APIs "to monitor effects", as Twitter apparently does. But the report noted that fact-checking labels "can be counter-productive" and providing alternative links may be more effective. It envisaged a goal of "an open market for fact-checking" that avoids a "monopoly of truth" with a management board "composed of experts". That should do it.

Only 13 per cent of UK respondents think the EU should play a part in fixing the fake news problem. Most favour journalists and citizens sorting it out themselves. But as we've noted before, the idea that humans "can be engineered and corrected by instruction from their enlightened betters" (Daniel Bell) burns bright.


The Eurobarometer polling highlighted big differences across the EU. Trust in print varies from 33 per cent (Hungary) to 90 per cent (Finland); trust in online social networks and messaging apps peaks at 41 per cent (Portugal and Sweden) and falls as low as 17 per cent in Germany (UK: 25 per cent).

More people are confident they can identify false news or misinformation than not, ranging from 98 per cent in Denmark down to 56 per cent in Portugal and 55 per cent in Spain. ®

