Leaked Facebook docs reveal self-harm allowed by network's moderators

These rules were revealed through a recent Guardian investigation.

For comments that seem cruel or insensitive, moderators can recommend a "cruelty checkpoint": a message is sent to the person who posted the comment asking them to consider taking it down. What constitutes revenge porn has been codified; threats of violence have a classification system - credible and generic/non-credible - and Facebook investigates up to 6.5 million accounts a week to determine whether they are fake or malicious. The company gave significant leeway to certain types of violent content, such as self-harm and threats, while relying on "newsworthiness" to decide whether videos and livestreams of suicide and terrorism should be removed.

Videos of "non-sexual" child abuse were permissible and marked as disturbing content, provided there was no sadism or celebration.

In one document, Facebook said users felt safe to use violent language to express frustration.

Videos of abortions are permitted as long as there is no nudity.

"Keeping people on Facebook safe is the most important thing we do", Facebook's head of global policy management Monika Bickert said in a statement to Recode.

Remarks such as "Someone shoot Trump" should be deleted because, as a head of state, he is in a protected category. The Guardian reported some disturbing findings about what can and can't be moderated on Facebook after the newspaper was passed more than 100 internal training manuals, including spreadsheets and flowcharts, on how the Mark Zuckerberg-run company deals with hate speech, violence, self-harm, and a range of other issues. The documents revealed the company's efforts to offer a platform where people can engage in free speech while avoiding harmful content. According to the documents, direct threats of violence against Donald Trump will be removed ("someone shoot Trump"), but misogynistic instructions for harming women may not be ("to snap a bitch's neck, make sure to apply all your pressure to the middle of her throat").

Videos of deaths don't always have to be deleted because they can raise awareness of issues such as mental illness. Such videos are instead marked as disturbing and hidden from minors.

"These files demonstrate why powerful social media companies, including Facebook, have to be more transparent as the Home Affairs Select Committee recommended", she said.

The site now makes allowances for "newsworthy exceptions", like the famous Vietnam War photo of a naked young girl hit by napalm, and for "handmade art".

It's a bit depressing how hard it is to tease out the rationale behind these guidelines: "I'm going to kill you" is not a credible threat because it's abstract, but the very specific "unless you stop bitching I'll have to cut your tongue out" is still allowed.

"Facebook can not keep control of its content", one source told The Guardian.