Facebook's Content Moderation Rules Are Both Careful and Shocking
The rules for dealing with violent and disturbing images often require moderators to ask whether they are "newsworthy" or "raise awareness."
By Nina Zipkin
There is no doubt that it takes a huge effort to moderate all the content that gets uploaded to Facebook. But over the past few months, the social giant has shown signs of strain.
Back in August, shortly after the company fired the team of human editors overseeing the Trending section of the site in favor of an algorithm, a false news story found its way to the top of the queue.
In February, CEO Mark Zuckerberg published a wide-ranging open letter on his Facebook page about the direction he hopes to take the company, touching on the need for more vigilance in the face of "fake news" and also a stronger infrastructure to handle the raft of content that is posted by users on a daily basis.
"There are billions of posts, comments and messages across our services each day, and since it's impossible to review all of them, we review content once it is reported to us," Zuckerberg wrote. "There have been terribly tragic events -- like suicides, some live streamed -- that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more."
This spring, after a murder in Cleveland was livestreamed on the platform, Zuckerberg announced that over the course of the year, 3,000 people would be hired to improve that review process.
But now, an investigation conducted by the Guardian has identified some of the standards Facebook operates from when it comes to moderating content, and they are perhaps more confusing than you might expect.
Videos of violent deaths or suicides are designated as disturbing content, but Facebook's reasoning for not necessarily taking them down is that they can build awareness about mental illness, according to the Guardian's findings.
Specifically in cases of suicide, documents that the Guardian has been privy to explain that the current company directive is "to remove them once there's no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up."
When it comes to violent language, a call to action to harm the president would be taken down because he is a head of state, but directions about how to snap a woman's neck would be allowed to remain on the site because such statements are not "regarded as credible threats."
Images and videos of animal abuse and graphic violence are also designated as disturbing, and they are allowed if they are used to educate and raise awareness, but not if there is an element of "sadism and celebration." The same rule applies to images pertaining to child abuse.
According to the Guardian, moderators often have seconds to determine how to characterize content or whether to remove it.
It's clear that Zuckerberg and his team have a daunting task in front of them, so Facebook's rules will need to constantly evolve to meet the challenge.