Facebook on Tuesday unveiled a series of previously unreleased rules detailing how the company monitors users’ posts on its platform in an effort to be more transparent about its operations.
The community standards, which are posted on the company’s website, detail the various words and images that the platform censors.
“You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” Facebook Vice President of Product Policy and Counterterrorism Monika Bickert said, according to Reuters.
“Everybody should expect that these will be updated frequently,” she said.
Facebook has long had a “community standards” page on its website, but it was considerably less detailed than the rules released on Tuesday.
Examples in the rules include Facebook’s bans on videos of people who have been wounded by cannibalism and on attempts to buy or sell marijuana, prescription drugs and firearms on the platform.
The new community standards do not address the company’s use of personal data and do not discuss how the platform will handle false information or “fake news,” according to Reuters. Both of these have been contentious issues for the company in the past.
Facebook has recently been grappling with backlash over its handling of the data scandal involving Cambridge Analytica.
The British research firm was able to improperly harvest the data of 87 million Facebook users.
Facebook founder and CEO Mark Zuckerberg was grilled on the issue by lawmakers on Capitol Hill earlier this month.