Facebook grows a conscience, admits it corroded democracy

Mr Zuckerberg, are we the baddies?

Always looking for an excuse

What it proposes still falls far short of what media organizations are legally required to do during election seasons. And even while acknowledging the need to be more in line with rules drawn up decades ago, Facebook is already giving itself an out.

"As critical as this plan is, it poses challenges," Chakrabarti wrote. "How, for example, do we avoid putting legitimate activity at risk? Many human rights organizations commonly use Facebook to spread educational messages around the world. The wrong kind of transparency could put these activists in real danger in many countries."

Yep, there's nothing worse than the wrong kind of transparency.

The same approach is taken with Facebook's other fundamental problems, which Chakrabarti lists as:

  • Fake news
  • Echo chambers
  • Political harassment
  • Unequal participation

In each case, Chakrabarti recognized the problem honestly, noted what outsiders have been proposing most loudly – and then briefly explained why the company isn't going to do it.

In every case, it is not Facebook but everyone else who is responsible for fixing things. On fake news: "The best deterrent will ultimately be a discerning public." On echo chambers – where Facebook's algorithms feed people the same information – it is us, not Facebook, who are to blame: "The deeper question is how people respond when they encounter these differing opinions - do they listen to them, ignore them, or even block them?"

Not our fault

When it comes to harassment, again, it is not Facebook but "governments themselves" who are responsible. "Even in more open societies, we’re seeing cases where government officials write hateful posts that make enforcing our Community Standards challenging," Chakrabarti complains.

And the way to stop it is – guess what? – you again. "Policing this content at a global scale is an open research problem since it is hard for machines to understand the cultural nuances of political intimidation."

Toward the end, Chakrabarti tried his own version of Vint Cerf's famous observation that the internet is a mirror of society. "If you stand in front of a mirror and you don't like what you see, it does not help to fix the mirror," the man frequently credited as one of the fathers of the internet said more than a decade ago.

Except Cerf – who is Google's "chief internet evangelist" – was talking about the entire internet, not an entirely self-contained universe controlled by algorithms, one that requires you to log in to visit and funds itself from the sale of users' information. A platform that just happens to be online.

"If there's one fundamental truth about social media's impact on democracy it's that it amplifies human intent - both good and bad," philosophized Chakrabarti. "At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy."

Pirates

Pathologically incapable of reflecting too long on life's darker side, however, he quickly reverted to Silicon Valley sunshine: "What gives me hope is that the same ingenuity that helped make social media an incredible way to connect with friends can also be applied to making it an effective way to connect with the public square. In the end, that’s why I believe that a more connected world can be a more democratic one."

The Mitchell and Webb Nazi sketch came in two parts. In the second, Erich keeps pressing his concern about the skulls, with Hans arguing back.

Erich: "What do skulls make you think of? Death, cannibals, beheading, pirates..." Hans brightens up: "Pirates are fun!"

Facebook is at the "pirates are fun" stage when it comes to its own platform. ®