After severe backlash over incorrectly censoring or removing images it considered to be in violation of its Community Standards, the social media giant is making big changes to its nudity and violence policy.
Previously, Facebook relied on strict Community Standards under which it censored or swiftly took down content it considered objectionable, and it found itself criticized for removing one of the most iconic images in history: Nick Ut's Pulitzer Prize-winning "Napalm Girl" photograph, which changed how millions around the world perceived the Vietnam War.
Citing nudity as a violation of its Community Standards, the social media mammoth took down the image, only to restore it later after heavy flak from journalists and users.
"It is true that Facebook is a private company with a legal right to censor content. But as a global giant that claims in its mission statement 'to give people the power to share and make the world more open and connected,' Facebook has an ethical responsibility to facilitate the free flow of information and ideas, especially news. Instead, Facebook is giving users a dangerously manipulated view of that world and contributing to the age of truthiness," noted an article published on Quartz about the incident.
This time around, Facebook has weighed its options and decided not to act as judge, jury, and executioner, instead letting users vote on and choose what constitutes objectionable content for themselves.
In a lengthy manifesto published on Facebook yesterday, Mark Zuckerberg announced that the company would be parting ways with its essentially one-size-fits-all rules for censoring nudity and violent content, replacing them with settings that let users decide the parameters of what they are comfortable seeing.
An excerpt from the manifesto reads, “The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.
With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allow.”
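To make the mechanism described in the excerpt concrete, here is a minimal sketch of how such per-user content settings could work: each user sets a tolerance level per category, users who make no choice inherit the regional majority's choice "like a referendum," and content is removed outright only if it exceeds what the most permissive option allows. The category names, the 0-3 tolerance scale, and every function below are illustrative assumptions, not anything Facebook has published.

```python
from collections import Counter

# Hypothetical ordered tolerance scale: higher value = more permissive.
TOLERANCE_LEVELS = {"strict": 0, "moderate": 1, "permissive": 2}
MOST_PERMISSIVE = max(TOLERANCE_LEVELS.values())
CATEGORIES = ("nudity", "violence", "graphic_content", "profanity")


class ContentSettings:
    """A user's personal threshold for each content category."""

    def __init__(self, **levels: str):
        self.levels = {c: levels.get(c, "moderate") for c in CATEGORIES}


def regional_default(region_settings: list) -> ContentSettings:
    """Default for users who made no choice: the majority choice
    in their region, computed per category ('like a referendum')."""
    defaults = {}
    for c in CATEGORIES:
        votes = Counter(s.levels[c] for s in region_settings)
        defaults[c] = votes.most_common(1)[0][0]
    return ContentSettings(**defaults)


def should_remove(post_rating: dict) -> bool:
    """Content is taken down only if it is more objectionable than
    the most permissive option allows; everything else stays up."""
    return any(post_rating.get(c, 0) > MOST_PERMISSIVE for c in CATEGORIES)


def visible_to(post_rating: dict, viewer: ContentSettings) -> bool:
    """For content that stays up, each viewer's own settings
    decide whether it appears in their feed."""
    return all(
        post_rating.get(c, 0) <= TOLERANCE_LEVELS[viewer.levels[c]]
        for c in CATEGORIES
    )


# Example: a post with mild nudity stays on the platform, but a viewer
# with a strict nudity setting simply does not see it.
viewer = ContentSettings(nudity="strict")
post = {"nudity": 1}              # mild, on the illustrative 0-3 scale
print(should_remove(post))        # False: within the most permissive option
print(visible_to(post, viewer))   # False: hidden from this viewer only
```

The key design point the excerpt implies, and the sketch mirrors, is that moderation splits into two separate decisions: a single platform-wide removal rule pegged to the most permissive setting, and a per-viewer filtering rule pegged to each user's own thresholds.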
Facebook hopes to offer a far more personalized level of censorship to users who proactively specify the Community Standards they wish to see in place. The company also hopes its AI, which has stumbled in this area before, will learn to effectively tailor moderation to each user's preferences.
By doing this, Facebook sheds a burden of accusation its platform has long carried: that of deciding what people see in their News Feeds. To build on this start, the social network also plans to extend the approach to bullying and self-harm, possibly including something similar to Instagram's self-harm and suicide prevention tools.