Meta Oversight Board Issues Highly Critical Review of Cross-Check Program

Content from politicians, business partners, celebrities, others remained on Facebook, Instagram for several days when it would have otherwise been removed quickly


All Facebook and Instagram users are created equal, but some are more equal than others.

The Meta Oversight Board issued its recommendations Tuesday on the company’s controversial cross-check program, and Meta president of global affairs Nick Clegg said in an updated Newsroom post that a response would come within 90 days.

The Meta Oversight Board said in a statement, “For years, cross-check allowed content from a select group of politicians, business partners, celebrities and others to remain on Facebook and Instagram for several days when it would have otherwise been removed quickly. Meta told the board that, on average, it can take more than five days to reach a decision on content from users on its cross-check lists. Based on the information provided to the board, Meta has taken as long as seven months to reach a final decision on a piece of content subject to cross-check.”

The independent group said it reviewed thousands of pages of internal documents, asked the company 74 questions, received four briefings from Meta, held four regional workshops and received 87 public comments from organizations and individuals. It also spoke with experts, including former Facebook product manager and whistleblower Frances Haugen, the source of The Wall Street Journal reporting that shed light on cross-check.

“While Meta did offer some transparency as part of our review of cross-check, its responses were at times insufficient, and its own understanding of the practical implications of the cross-check program was lacking,” the Oversight Board said. “The company has room to improve, and its request to the board for an independent review of this program was an important step in that process.”

The Oversight Board outlined aspects of cross-check that ran counter to the company’s responsibility to identify and mitigate negative human rights impacts and to uphold its own values, including:

  • A broad scope to serve multiple and contradictory objectives that enables visibility and virality for violating content.
  • Unequal access to discretionary policies and enforcement.
  • Program enrollment may exceed reviewer capacity.
  • Failure to track core metrics to assess the program, ascertain whether it is meeting objectives and make improvements.
  • Lack of transparency and auditability about its functioning.

The independent body issued 32 individual recommendations, which included:

  • Prioritize affording secondary review to users who are likely to produce expression that is important for human rights.
  • Publish clear criteria for eligibility for secondary review and allow eligible users to apply to be added to review lists.
  • Publicly mark the accounts of certain entities protected by secondary review, such as state actors, political candidates and business partners.
  • When content identified as violating during Meta’s first assessment is classified as high severity, it should be removed or hidden while further review is taking place.

Clegg wrote in the updated Newsroom post, “We built the cross-check system to prevent potential over-enforcement (when we take action on content or accounts that don’t actually violate our policies) and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe. For example, when journalists are reporting from conflict zones.”

He said that over the past year, Meta developed principles and governance criteria to ensure that cross-check is run in a more consistent way, and the company developed a new cross-check ranker covering content from all users and using algorithms to assess situations where a “false positive” action may have been taken on content.

Clegg added that the company also: formalized a set of objective criteria for adding users and entities to cross-check; reduced the number of employees who can add them; developed a process for removing users; and established annual reviews “to make sure that we periodically verify our lists of users and entities that received secondary reviews, as well as the requirements for including them in the first place.”