No nudity please, we're killing ourselves: Advice to Facebook mods leaks
What's allowed and what isn't
A trove of Facebook's guidance for its English-language content moderators* has been obtained and published by The Guardian newspaper, giving insight into how the company handles material on the site.
Facebook, which earned $27bn last year, has 4,500 human moderators and recently said it would add another 3,000. They decide whether material posted by the network's two billion users stays online or not.
Facebook is protected by Section 230 of the Communications Decency Act, which in the United States gives any intermediary a powerful legal shield against liability, so it isn't surprising to find the company leaning towards permissiveness.
Videos of violent deaths "can help create awareness", so they are allowed but must be marked disturbing. The same goes for footage of non-sexual child abuse, unless it is shared "with sadism and celebration".
Facebook doesn't by default block live streams of self-harm, which run into the thousands each week, because it "doesn't want to censor or punish people in distress who are attempting suicide", but it will take a video down once "there's no longer an opportunity to help the person".
Videos of abortions are not forbidden unless the subject is nude. It's the nudity that's the no-no.
Internal Facebook documents indicate a rise in self-harm being streamed, with more than 5,000 incidents reported over a two-week period. The company does react to media scrutiny: it was criticised after a Chicago man's suicide was live streamed, and after video of a Thai user murdering an infant remained online for 24 hours.
Moderators are told to consider whether an incident was "newsworthy".
You can find more in The Guardian's report. ®
*All material cited is in English; moderation guidelines for other countries must also comply with local laws.