Revealed: Facebook, Google's soft-money 'blackmail' to stall Euro fake news crackdown
EU experts claim US tech giants use funding carrot to influence findings
Comment Facebook and Google used grants and other funding for academics and journalistic organizations as leverage to pressure a group of experts in Europe into watering down proposals on fake news, it was claimed yesterday.
Despite being required to sign an agreement that included a confidentiality clause not to discuss what occurred, a number of eggheads broke ranks to report on what they called, in their own words, "blackmail" and "arm-wrestling" by representatives of Facebook in particular.
One member who has gone public – director-general of the consumer advocacy group BEUC, Monique Goyens – alleged that senior Facebook staff threatened to create problems if the group of experts pursued an effort to investigate whether the US tech giant was abusing its market power. "We were blackmailed," she bluntly summarized.
If this market abuse probe had continued, it would have encouraged the EU competition commissioner to examine whether Facebook and Google's business models had enabled the spread of fake news. Facebook didn't want that to happen, and according to another member who has remained anonymous, the biz considered pulling its financial support from organizations the experts were representing as a way of killing the idea.
That group member told Open Democracy the US corp "threatened that if we did not stop talking about competition tools, Facebook would stop its support for journalistic and academic projects."
Spokespeople for Google and Facebook declined to comment on Open Democracy's findings.
According to Open Democracy, at least 10 of the 39 members of the group worked for organizations that have been direct recipients of money from tech giants; several have received funding from both Google and Facebook. The Reuters Institute for the Study of Journalism, at the University of Oxford in England, for example, has received €10m from Google. Others include the Poynter Institute and First Draft News.
According to Goyens, it was not made clear to the group's members that some of them had a conflict of interest. Another anonymous member said "it quickly became clear that [Google] had some allies at the table." Another noted that "there was heavy arm-wrestling in the corridors from the platforms" over what could and should be discussed.
What was the impact of this so-called "soft money"? All of the ideas put forward that would have forced the American tech giants to be more transparent about their business models and their decision-making were killed off. A planned vote to have the EU competition commissioner look at the impact of their market power never happened.
When the final report was published last year, it contained a host of well-meaning but vague and ultimately worthless recommendations. Just one example: "Develop tools for empowering users and journalists to tackle disinformation and foster a positive engagement with fast-evolving information technologies." Which it doesn't take a Rhodes Scholar to recognize means next to bugger all.
And in the ultimate signpost that a working group has made no useful progress, the final report recommended the European Union "promote continued research on the impact of disinformation in Europe…"
In other words, Facebook and Google successfully prevented any serious investigation of their activities, actions, or business models that could have led to concrete changes to how they did things, or so it is claimed.
This is far from the first time that so-called "soft money" has been used by tech giants to dissuade what should be independent reviewers from pursuing specific lines of inquiry or investigation. For years, academics have been alarmed at Google's funding of hundreds of researchers and their organizations, and how it has used that influence to avoid serious scrutiny of its actions.
One of the more famous incidents occurred back in 2017 when academic Barry Lynn went public with his claim that his Open Markets program had been "spun off" from think-tank New America in direct response to a paper he had written that advocated fining Google for anti-competitive behavior.
New America's CEO Anne-Marie Slaughter received an angry phone call from Google chairman Eric Schmidt, according to Lynn, and shortly after that he was called into a meeting where he was told he was "imperiling the institution as a whole."
"The time has come for Open Markets and New America to part ways," Slaughter allegedly told Lynn.
After Lynn went public, Slaughter hit back, claiming that Lynn had "repeatedly violated the standards of honesty and good faith with his colleagues, including misleading me directly." Slaughter, it turns out, is a long-standing friend of Schmidt's, but that did not factor into her decision, she claimed.
There are countless similar stories – although very few of them explode into the open. Instead, in academic circles, there are hushed tales and careful warnings. When funding is a persistent worry for academics, it doesn't pay to piss off one of the few mega donors in your field.
Plus, of course, there is always a seemingly perfectly reasonable explanation for why Google or Facebook can no longer fund a specific program or institution. Google has become so expert at the shadowy practice that it doesn't even need to make implicit threats: organizations know that if they wish to get funded, then digging into an area likely to annoy the tech giant is not a great idea.
While it is telling that Facebook apparently felt the need to pressure experts to stay away from hot topics, it is only an indication that the antisocial network is new to the game. Give it a few years, and the soft-money influence will become as invisible as Google's has. ®
Updated to add
A spokeswoman for Facebook has been in touch to say this about Open Democracy's report:
This is a deliberate misrepresentation of a technical discussion about the best way to bring a cross-industry group together to address the issues around false news. We believe real progress has been made through the code of conduct process and we are looking forward to working with the European institutions to implement it.