
Facebook's leaked moderation rules show the company desperately needs to be more transparent

Facebook thinks that the message "to snap a b---h's neck make sure to apply all your pressure to the middle of her throat" can be permissible. Likewise, it won't take down video livestreams of people self-harming. Images of animal abuse? Also okay.

We know all this because The Guardian has got its hands on the manuals and internal documents used to train Facebook's hidden army of moderators, who police the platform for material that falls foul of its community standards.

Facebook CEO Mark Zuckerberg.
Justin Sullivan/Getty Images

Facebook's standards for moderation have previously attracted heavy criticism. Take the time it censored iconic Vietnam War photo "The Terror of War," and censured Aftenposten, Norway's biggest newspaper, for publishing it. Or its banning of a Renaissance-era Italian statue for being "sexually explicit." Or the time it suspended users who posted a photo of Aboriginal women in traditional dress.

Facebook has extraordinary, unparalleled power to shape the flow of information in the world today. The "community standards" it sets control how more than a billion people communicate, while its opaque algorithms decide what is noteworthy and deserving of human attention and amplification.

And it does all this with almost no oversight, beholden only to its shareholders — and its CEO Mark Zuckerberg, who holds a controlling stake in the company.

The publication of Facebook's internal moderation rules is welcome — but it's scandalous that this is the only way users, lawmakers, and journalists will get to see them. Facebook does publish public "community standards" outlining what is and isn't okay on the social network. But previous moderation scandals have only highlighted their shortfalls, without shedding light on why these failings keep happening.

That's the real value of The Guardian's leak. The documents provide clarity on how one of the world's most powerful companies — one with the power to shape public debate and set social norms, unlike any other — really operates.

In a statement, Facebook's head of global policy management Monika Bickert said: "Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously. Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly."

'The Terror of War,' one of the most iconic photos of the twentieth century.
AP Photo/Nick Ut

Over the last year or so, people have increasingly woken up to the power that Facebook wields. The social network has been accused by former moderators of suppressing conservative news. Fake news has propagated on its platform, spreading misinformation, and some critics have accused it of having a hand in the election of Donald Trump.

More recently, Wired reports, the company boasted to advertisers of its ability to target young users when they were feeling "insecure," "anxious," like a "failure," and during other times of psychological crisis.

"We have now arrived at the point where Facebook, by controlling what they show to more than 1 billion people every day, has aggregated so much editorial power, that Zuckerberg must acknowledge his responsibility and take part in the discussion," Aftenposten editor Espen Egil Hansen wrote in a column for The Guardian last year. "The alternative, a continued passive approach to this debate, will be bad for democracy, bad for the conversations our communities rest on, and maybe even bad for Facebook themselves in the long term."

In short: Facebook is not just another app maker or tech company. Zuckerberg may insist that he doesn't intend to run for president, but he sounds more and more like a politician every day, publishing an epic near-6,000-word manifesto in February 2017 about his intentions to build a "global community."

If Facebook wants to live up to that awesome responsibility, it needs to commit to proactively releasing far more guidance on what it allows and how it handles its users' data — and if not, governments should compel it to do so.

This column does not necessarily reflect the opinion of Insider.
