Facebook has announced that it is setting up an oversight board in a bid to self-regulate content decisions. This comes after the social media giant was criticised for not doing enough to remove controversial content on its platform. Facebook has appointed 20 people from different walks of life across different countries, hoping to win back people's trust by bringing in an independent body to oversee these content decisions. But there are still concerns over how the oversight board will function. Brent Harris, Director, Governance and Strategic Initiatives, Facebook, responded to queries sent by BusinessLine over email.

Is the board designed for rapid response to viral issues?

No. The purpose of the board isn't necessarily to deal with rapid viral issues, but with complex, challenging content issues that have wide-ranging impact. In exceptional circumstances - for example, when content could result in urgent real-world consequences - Facebook may send cases to the Board for automatic and expedited review. The Board will review them as quickly as possible. It is important to stress that even an expedited review will only ever take place after content has been posted and may still take several days. The Oversight Board does not exist to prevent or rapidly respond to these content issues in real time.


Facebook has partnered with other social media companies in the Global Internet Forum to Counter Terrorism to create a specific rapid response mechanism for this type of scenario. The guidance the Oversight Board provides will hopefully mean that over time these “rapid” responses have a clearer basis in international human rights norms.

The Oversight Board may hear cases, after the fact, where posts have been wrongly removed as "terrorist content" (for example, a media report taken down because it included graphic images of a newsworthy incident), as well as cases where content has wrongly been left up (e.g. a post praising a terrorist attack while encouraging future attacks). These are the types of issues we can expect to be called upon to address in the future.

While the Oversight Board’s decisions on a specific piece of content are binding, will Facebook apply them to similar existing pieces of content elsewhere on the platform?

The board can also recommend that Facebook enforce the decision across variations or reproductions of the original content posted by other users. In this case, Facebook will make best efforts to do so where technically and operationally feasible, but cannot guarantee action on all past or future copies of the original content.

For more on case implementation, see the Bylaws Newsroom post.

Does the board have the power to amend the rules governing it as and when it discovers gaps in the governance structure, especially rules related to decision-making and review?

The bylaws can be amended through the approval of the trustees, Facebook and a majority of the board. A key exception is the board's own operating procedures, as detailed in Article 1 of the bylaws, which the board can amend unilaterally through a majority vote.

See Oversight Board Bylaws, Article 5

The charter can only be amended through majority approval by the trustees and board members and agreement with Facebook.

See Oversight Board Charter, Article 6

Can the board pick a case suo motu, if required?

The board will be ultimately responsible for choosing the cases it wants to hear from those appealed by a user or referred by Facebook.

The Board will set the prioritization criteria once it is up and running; deciding the exact criteria for selecting cases will be one of the first things members take up.

How would the board deal with cases related to misinformation in advertisements?

The Oversight Board was established to consider the most significant content issues facing the Facebook and Instagram communities, including on advertising and issues relating to the impact of social media on democracy.

These are some of the most complex issues facing society, and the Board is committed to addressing these in a careful way that reflects the important long-term implications that decisions in this area can have for online discourse and civic participation.

We are less than six months from the US presidential election, and the infrastructure for the Board to consider user appeals is being finalized over the coming months. We expect the Board to be fully operational this autumn, so we want to be realistic about the Board's capacity to make quick decisions on these issues or to hear a political ads case ahead of the election.
