How Facebook Decides If You See Nudity or Death (HBO)
Facebook employs 4,500 content moderators around the world. Moderators get two weeks of training and a stack of manuals to help them police the site for racism, misogyny, violence, and pornography.
VICE’s partners at The Guardian obtained more than a hundred of these manuals, which offer the first-ever look at the sometimes logical, sometimes inexplicable ways Facebook asks a few thousand people to help patrol its nearly 2 billion users.
This segment is part of the May 23rd VICE News Tonight episode.
Watch VICE News Tonight on HBO Mondays through Thursdays at 7:30 PM ET.
Subscribe to VICE News here: http://bit.ly/Subscribe-to-VICE-News
Check out VICE News for more: http://vicenews.com
More videos from the VICE network: https://www.fb.com/vicevideo