As Facebook pushes yet more fake articles, one news editor tells Mark to get a grip – or Zuck off

Hard to hear when you're buried under piles of $$$

Analysis If ever there was an argument for why journalism is not only important but a profession requiring a broad set of specific skills, then Facebook's seeming inability to do news is it.

Despite the social media giant being hauled over the coals for several months for failing to limit – or even recognize – widespread abuse of its platform to spread fake news, the California corp failed yet again this week to differentiate real news from obvious nonsense.

On Wednesday, in Virginia, USA, an Amtrak train carrying Republican lawmakers crashed, killing one person and injuring several passengers. But if you checked out Facebook's curated news section, you would have found the "news" that it was no accident.

It was, of course, a "false flag" operation – isn’t everything these days? – carried out by anti-Trump activists who possibly had connections to Hillary Clinton.

In the "people are saying" section of its trending news section, bogus news stories pushing the above conspiracy theory were prominently displayed on the page covering the Amtrak crash.

That means that despite its best efforts, with the world watching, Facebook remains unable to distinguish real news from obvious falsehoods. Why? And does it matter?

Breaking (the) news

First, why is Facebook incapable of doing what journalists have been doing for decades: identifying reliable – and unreliable – information?

The answer is simple: because the company is determined to rely on its algorithmic prowess, and only that, to deal with the huge swathes of content that its users post and produce. It is culturally incapable of imagining how it could use its billions of dollars in profit – $4.2bn in just three months, we learned on Wednesday – to hire people who would resolve its fake news problem.

It tried it once, hiring a group of journalists to scour and curate news, and it didn't go well. Something about having a group of independently minded, authority-questioning individuals working for Facebook HQ didn't quite gel with the prevailing work culture.

How does Facebook figure out what news appears and what doesn't? The biz won't say exactly, but it's safe to assume it prefers stories that its users have linked to, particularly articles that are heavily shared and from news organizations that Facebook views as trustworthy.

It's all a black box, of course. Why? Because it doesn't want people to game the system.
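For a flavour of why engagement-first ranking rewards virality rather than accuracy, here is a minimal, entirely hypothetical sketch in Python. It is not Facebook's code: the fields, the weights, and the "source_trust" factor are assumptions made purely for illustration, showing how a heavily shared fabrication can outrank a lightly shared wire report.

```python
# Hypothetical illustration only -- Facebook does not publish its ranking code.
# A naive engagement-driven score like this shows why heavily shared links
# can float to the top regardless of whether they are true.

from dataclasses import dataclass

@dataclass
class Story:
    url: str
    shares: int          # times users shared the link
    likes: int           # reactions of any kind
    source_trust: float  # assumed publisher-trust weighting, 0.0 to 1.0

def engagement_score(story: Story) -> float:
    """Toy ranking: engagement dominates; 'trust' is just one multiplier."""
    return (story.shares * 2.0 + story.likes) * (0.5 + story.source_trust)

stories = [
    Story("example.com/amtrak-wire-report", shares=800, likes=2_000, source_trust=0.9),
    Story("example.net/false-flag-conspiracy", shares=50_000, likes=120_000, source_trust=0.3),
]

# The fabricated story scores roughly 176,000 against the wire report's 5,040,
# so it wins the slot -- the failure mode described above.
for s in sorted(stories, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):>12,.0f}  {s.url}")
```

In a scheme anything like this, a modest trust penalty is easily swamped by raw sharing volume, which is exactly the weakness that clickbait merchants and propagandists exploit.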

The great game

But, of course, they do game the system, as an open letter to Facebook CEO Mark Zuckerberg from the editor of the San Francisco Chronicle – a newspaper covering the Bay Area where Zuck has a nice house – makes plain.

"We struggled along, trying to anticipate the seemingly capricious changes in your news-feed algorithm," wrote Audrey Cooper earlier this month. "We created new jobs in our newsrooms and tried to increase the number of people who signed up to follow our posts on Facebook. We were rewarded with increases in traffic to our websites, which we struggled to monetize."

She goes on: "We were successful in getting people to 'like' our news, and you started to notice. Studies show more than half of Americans use Facebook to get news. That traffic matters because we monetize it - it pays the reporters who hold the powerful accountable."

But it wasn't just newspapers that had people dedicated to figuring out what was in the Facebook black box, as Cooper points out: "Just as we had learned how to market real news on Facebook, so too did more nefarious operations. It became clear foreign agents and domestic partisans used Facebook to spread fabricated news."

And it is for this reason that algorithms will never work for news: Facebook assumes its users have the skills and noses of news editors when they don't. People are fickle, prejudiced, easily swayed, and just as easily conned. Plus, of course, they are not getting paid to focus on the truth. Sharing a post takes one tap and two seconds. What does it matter if it's not real?

Plus, no matter what clever algorithms are put in place, if someone has a strong enough incentive, they will figure out how to game the system. And foreign agents, driven partisans, and the owners of clickbait websites have that incentive in spades.
