WNPR

Facebook Updates Community Standards, Expands Appeals Process

Originally published on April 24, 2018, at 12:14 pm

Facebook announced changes to its content review policy Tuesday, adding an appeals process for removed content and releasing the internal guidelines it relies on to make content determinations.

While the social media giant has listed a set of publicly available community standards for several years, the latest update includes the more detailed guidelines that content reviewers use internally when deciding whether to allow or remove posts.

The updated appeals process will allow users whose photos, videos or posts have been removed to contest determinations they feel were wrongly made. Previously, appeals of community standards determinations were allowed only when a Facebook page, group or profile was removed entirely.

Facebook has been hesitant to reveal details of its content review policy in the past. But the company says Tuesday's announcement is part of its promise to "do better" and be more transparent about how it decides what stays up and what gets taken down. The changes come just weeks after CEO Mark Zuckerberg was grilled on Capitol Hill about Facebook's alleged censorship of conservative viewpoints.

Zuckerberg was asked by multiple lawmakers during his marathon testimony about "Diamond and Silk," two pro-Trump commentators who claim Facebook intentionally limited their presence on the site because of their political views. Zuckerberg apologized, calling the situation an "enforcement error" on Facebook's part, but the controversy raised questions about what type of content Facebook restricts and how it makes those decisions. Diamond and Silk are themselves set to testify before Congress later this week.

Granular standards

The newly released standards are a stark departure from Facebook's prior guidance, which had been crafted to express the company's values and priorities in a way that did not overwhelm readers.

"We've always had a set of community standards that the public can see," Facebook Vice President Monika Bickert told NPR's Steve Inskeep, "but now we're actually explaining how we define those terms for our review teams and how we enforce those policies."

Those new explanations are nothing if not comprehensive. They detail dozens of reasons posts can be removed and read more like the product of a team of lawyers than the words of an upstart tech company. The standards outline methods for categorizing content and provide specific definitions for terms like "hate speech," "terrorism" and "threats of violence."

"People define those things in different ways," Bickert said, "and people who are using Facebook want to know how we define it and I think that's fair."

Some objectionable content is classified into tiers, with Facebook's response matching the severity of the violation. Other content is removed if it satisfies multiple conditions in a points-based system. A threat of violence, for example, can be deemed "credible" and removed if it provides a target and "two or more of the following: location, timing, method."
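As a rough illustration of how that credibility test reads in practice, the sketch below expresses it as a simple check: a named target plus at least two supporting details. The function and field names are hypothetical, not drawn from Facebook's systems or the published standards.

```python
# Illustrative sketch only; the structure is hypothetical and mirrors the
# published rule that a violent threat is treated as "credible" when it
# names a target plus two or more of: location, timing, method.

CREDIBILITY_SIGNALS = ("location", "timing", "method")

def is_credible_threat(post: dict) -> bool:
    """Return True if a flagged threat meets the published credibility test."""
    if not post.get("target"):
        return False
    supporting = sum(1 for key in CREDIBILITY_SIGNALS if post.get(key))
    return supporting >= 2

# A post naming a target, a place and a time would clear the bar for removal.
flagged = {"target": "named person", "location": "city hall", "timing": "Friday"}
print(is_credible_threat(flagged))  # True
```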

Still other standards target particular categories of offensive posts. Content that "promotes, encourages, coordinates, or provides instructions" for eating disorders or self-harm is specifically mentioned. And under its "harassment" section, Facebook says it will not tolerate claims that survivors of traumatic events are "paid or employed to mislead people about their role in the event." Other standards prohibit advertising drugs, revealing the identities of undercover law enforcement officers, and depicting graphic violence.

Context matters

When it comes to judging content, though, context is crucial. Facebook has been criticized in the past for its blundering approach to community moderation. In 2016, for example, the company reversed its decision to remove a post containing the Pulitzer-winning "napalm girl" photo, which depicted a nude and burned child in the Vietnam War.

Bickert says that example proves that exceptions are needed for newsworthy and culturally significant content.

Facebook's updated standards now list some exceptions for depictions of adult nudity, including "acts of protest," "breast-feeding" and "post-mastectomy scarring."

Still, questions remain over Facebook's content moderation program. Despite Zuckerberg's stated desire to use artificial intelligence to flag offensive content, the process remains very human. According to Bickert, the company has over 7,500 moderators who are stationed around the globe and work 24/7.

But conversations with those moderators paint a much bleaker image of Facebook's processes than the one Bickert provides. In 2016, NPR's Aarti Shahani detailed a workforce composed primarily of subcontractors who are stationed in distant countries and asked to review large quantities of posts every shift.

It's not hard to imagine how someone located thousands of miles away, who grew up in a different culture, and who is under immense pressure to review as many posts as possible, might mess up.

The appeal of appeals

Facebook is seeking to address that problem with its new appeals system. Now, if your post is removed for "nudity, sexual activity, hate speech, or violence" you will be presented with a chance to request a review.

Facebook promises that appeals will be reviewed within 24 hours by its Community Operations team. But it remains unclear what relationship the team has with Facebook and with its first-line reviewers. If appeals are reviewed under the same conditions in which initial content decisions are made, the process may be nothing more than an empty gesture.
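For readers who think in terms of workflow, the announced process can be sketched roughly as follows. The class, field and status names are hypothetical illustrations of the policy as described, not Facebook's actual systems or API.

```python
# A minimal sketch of the appeal lifecycle described above, under the stated
# assumptions; names here are illustrative, not Facebook's implementation.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

APPEALABLE_REASONS = {"nudity", "sexual activity", "hate speech", "violence"}

@dataclass
class RemovedPost:
    post_id: str
    reason: str
    appealed_at: Optional[datetime] = None
    status: str = "removed"  # removed -> appealed -> restored | upheld

    def request_review(self, now: datetime) -> bool:
        """Posters may contest removals made under the listed reasons."""
        if self.reason not in APPEALABLE_REASONS:
            return False
        self.appealed_at, self.status = now, "appealed"
        return True

    def review_due_by(self) -> Optional[datetime]:
        """Facebook says appeals are reviewed within 24 hours of submission."""
        return self.appealed_at + timedelta(hours=24) if self.appealed_at else None
```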

Facebook points out that the content review and appeals process is just one way to clean up your experience on the site. Users have the ability to unilaterally block, unfollow, or hide posts or posters they don't want to see.

For the social media giant, it's a question of balance. Balance between free speech and user safety. Balance between curbing "fake news" and encouraging open political discourse. And balance between Facebook's obligation to serve as a steward of a welcoming environment and the realities of running a for-profit, publicly owned corporation.

"We do try to allow as much speech as possible," Bickert said, "and we know sometimes that might make people uncomfortable."

Facebook says that Tuesday's announcements are just one step in a continuous process of improvement and adjustment to its standards and policies. How much of an improvement this step represents remains to be seen.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

STEVE INSKEEP, HOST:

Facebook says it will be more open about the posts it takes down. The company tells NPR that today it is publishing internal details of its community standards. That's the term for what's allowed on Facebook and what is not. Monika Bickert is a Facebook vice president.

MONIKA BICKERT: So we've always had a set of community standards that the public can see that explain, for instance, no harassment, no bullying, no terror propaganda. But now we're actually explaining how we define those terms for our review teams and how we enforce those policies.

INSKEEP: She says users want more openness, which is an understatement. The company is under unprecedented pressure. It's been roiled by two years of questions - which news did it promote during the last election? How widely did it share users' data? - and more. Now it is revealing definitions used by internal monitors who check up on complaints about posts around the world, like, what exactly constitutes a genuine death threat? If it names a person, location or weapon, that should come down. Or what exactly amounts to hate speech?

BICKERT: Where we have drawn the line is that we will allow attacks or negative commentary about institutions or countries or religions, but we don't allow attacks against people. So if somebody is criticizing or attacking all members of a religion, that's where we would draw the line.

INSKEEP: I wonder if one of the gray areas there might be someone who criticizes Islam but in an extreme way that somebody might argue is inciting people against Muslims.

BICKERT: We do try to allow as much speech as possible about institutions, religions, countries, and we know sometimes that might make people uncomfortable. That's one of the reasons we give people a lot of choice and control over what they see on Facebook. You can unfollow pages, you can unfollow people, and you can block people that you don't want to communicate with.

INSKEEP: How are you thinking about the environment as the 2018 election approaches and, of course, there will once again be lots of political speech on Facebook?

BICKERT: Well, we know there are a lot of very serious issues, and it's important to get them right. We're focused on combating fake news. We're also focused on providing increased transparency into political advertisements and pages that have political content. And we're also investing a lot in our technical tools that help keep inauthentic accounts off the site.

INSKEEP: Are you already going after fake accounts in that larger, more specific way in the United States here in 2018?

BICKERT: Yes. The tools that we have developed to more effectively catch fake accounts - they've improved a lot, and we are using them globally. We now are able to stop more than a million fake accounts at the time of creation every day.

INSKEEP: The publication of its internal standards is another signal that Facebook is having to acknowledge that it is effectively a publisher. It wants to define itself as a technology company, just a platform for other people's speech, but the founder, Mark Zuckerberg, now accepts some responsibility for what is posted. Facebook was embarrassed when a famous old Vietnam War photo was mistakenly censored and then put back up. It's also had to tussle with authoritarian governments like Russia and Turkey that demand some posts be taken down. Just last weekend, Sri Lankan officials complained to The New York Times that Facebook was not responsive enough to complaints of hate speech. Monika Bickert says that when pressured by governments, the company at least tries to keep up speech that meets its standards.

What does this announcement suggest about the power your company has?

BICKERT: I think what it suggests is that we really want to respond to what the community wants. What we're hearing is that they want more clarity, and they want to know how we enforce these rules. That's why we're doing this. And we're actually hopeful that this is going to spark a conversation.

INSKEEP: But this is also a reminder, you've got this enormous fire hose of speech, maybe the world's largest fire hose of speech, and you can turn that fire hose on or off. It's your choice.

BICKERT: I want to be very clear that when we make these policies, we don't do it in a vacuum. This is not my team sitting in a room in California saying, these will be the policies. Every time we adjust a policy, we have external input from experts around the world.

INSKEEP: The company that claims some 2 billion users around the world insists it is straining to work within the laws of every country while still allowing as much speech as it can. Transcript provided by NPR, Copyright NPR.