Facebook relies on a series of PowerPoint slides containing information that is either outdated or flawed, and its moderators frequently resort to Google Translate, to moderate content, according to The New York Times.
Indian laws described incorrectly, comments critical of a religion in India mistakenly taken down, a Bosnian war criminal still described as a fugitive, and a paperwork error in Myanmar that allowed an extremist group to stay on the platform for months are a few examples of the discrepancies in those slides.
More than 7,500 Facebook moderators using PowerPoint slides to review posts from an estimated 2 billion-plus users is funny enough, but those slides being inaccurate is the real joke. The company confirmed the authenticity of the documents, according to the Times.
Google Translate's accuracy is far from reliable. If you've ever tried to use it, you know it often translates individual words rather than the sentence as a whole. The sentence can lose its actual meaning along the way, stripping out the context of the statement.
Moderators face loads of pressure while reviewing posts: they have to approve or reject each one in a matter of seconds, all while abiding by the guidelines.
Moderators have expressed frustration at rules that don't always make sense, according to The New York Times. Certain violent keywords may appear in a non-violent statement, and such statements are puzzling to judge when reviewed in just a few seconds.
Facebook as a platform has been prey to fake news, abusive content, illicit activities and more. Such activity can ignite conflicts, whether political, religious or otherwise.
There's no denying that reviewing so many posts from such a massive user base is a demanding task. But the resources and technology available at Facebook, which it currently uses to invade its users' privacy, could probably be put toward a more rational system.