Facebook’s long-awaited oversight board announced Thursday it is now accepting cases. The board, first announced by the social media behemoth in 2018, is meant to serve as an independent check on Facebook’s moderation decisions.
The board is composed of independent members from around the world, who will make final and binding decisions on what content Facebook and Instagram should allow or remove, based on respect for freedom of expression and human rights. “Our focus has been on building an institution that is not just about reacting to a single moment or chasing a specific news cycle, but about protecting human rights and free expression over the long term,” administrative director Thomas Hughes said in a Thursday call with reporters.
The members of the 20-person board were introduced in May, and they include a former prime minister, a Nobel Peace Prize winner, and the Guardian editor who oversaw the publication of the Snowden leaks. Each board member will serve a three-year term, and Facebook put $130 million in an irrevocable trust to fund the board’s operations. Crucially, Facebook has promised it will not interfere with the board’s decision-making.
For now, the board will only hear cases concerning content that was removed by Facebook. Individual users can bring appeals to the board, and Facebook as a company will be able to refer cases for expedited review if they could have urgent, real-world consequences. The board has sole discretion about whether to accept or reject cases referred by Facebook.
Brent Harris, Facebook’s director of governance and global affairs, said on the call that the company would not submit any cases for expedited review before the US presidential election on November 3rd.
Users won’t be able to flag third-party content that Facebook has decided to leave on the platform, at least not yet. Columbia Law professor Jamal Greene, a board co-chair, said this functionality will be added “in the coming months.”
Board members will take turns serving on a case selection committee, which will evaluate cases and choose which to accept by a majority vote of the committee. Each accepted case will be assigned to a five-member panel, which will include at least one member from the region of the content under review. The board will decide whether the content violates Facebook’s community standards and values, and whether it conforms with international human rights norms and standards.
“We can’t hear every appeal, simply because the volume that will be submitted is too high, but we want our decisions to be influential and have impact beyond the single case,” Greene said.
Helle Thorning-Schmidt, former prime minister of Denmark and another co-chair of the oversight board, said case decisions will be published and archived on the board’s website, along with the information used to reach them. Facebook must implement the board’s decisions unless there is a legal obligation to block access to the content, Thorning-Schmidt said, and Facebook will disclose any actions it takes.
“We will of course hold the company accountable to their commitment,” she added.
The board also decided it would implement a public comment period before it begins deliberation on a case, to allow third parties to share insights and perspectives. Users will be able to sign up to receive alerts when new cases are posted to the website and open for public comment.
While the board has met several times over Zoom to build up its procedures, Greene said it has not met to discuss substantive issues or what cases it would review. “It is conceivable, now that we’ve launched, that we will have substantive conversations and be looking for particular kinds of issues on which to weigh in,” Greene said. Thorning-Schmidt added that the board will set specific criteria for how it will select cases when they start coming in.
Facebook executives are meant to have no influence over the independent operation of the board, although its creation was the result of significant efforts by the company. In an interview with The Information, Mark Zuckerberg said he hopes to expand independent governance of Facebook if the oversight board is successful.
“Assuming the model works as planned, I hope to either expand its role or add other formal governance to more aspects of our content policies and enforcement over time,” he told the publication. “I’m optimistic about this and I think it’s very important that we create more independent governance here.”
To be eligible for review by the board, a user must first appeal the initial content decision through Facebook or Instagram and receive a final decision. The person filing the appeal must have an active account on the platform where the content was posted (Facebook or Instagram), and appeals must be submitted within 15 days of Facebook or Instagram’s final decision.
The board will make its decision within 90 days of accepting a case for review.
“We know, and we have said many times before, that we will not be able to solve all the problems regarding content on Facebook,” Thorning-Schmidt said. “The oversight board wasn’t created to be a quick fix or an all-encompassing solution, but to offer a critical independent check on Facebook’s approach to moderating some of the most significant content issues.”