Facebook on Wednesday announced the setting up of an oversight board that will decide what content should be allowed or removed on its platform, based on respect for freedom of expression and human rights. The board comprises independent people from different parts of the world with expertise in freedom of expression, digital rights, religious freedom, content moderation, online safety, internet censorship, platform transparency, and civil rights. Facebook has established a $130-million trust to fund all operations related to the board.

Sudhir Krishnaswamy, vice chancellor of the National Law School of India University and co-founder of an advocacy organization that works to advance constitutional values for everyone, including LGBTQ+ and transgender persons, is the only Indian among the first 20 people appointed to the board. Other appointees include Alan Rusbridger, former editor-in-chief of The Guardian; Helle Thorning-Schmidt, former Prime Minister of Denmark; András Sajó, former judge and vice president of the European Court of Human Rights; Tawakkol Karman, a Nobel Peace Prize laureate who used her voice to promote non-violent change in Yemen during the Arab Spring; and Afia Asantewaa Asare-Kyei, a human rights advocate who works on women’s rights across Africa.

The move comes after Facebook received severe backlash over how it handled content on its platform in the past. There have been instances where the social media platform allowed propaganda content that interfered with the outcome of elections in countries such as the US, as well as posts that fuelled communal unrest in some parts of the world.

Currently, Facebook has about 30,000 employees on its content moderation team. Specialised teams enforce against coordinated inauthentic behaviour (influence operations) aimed at manipulating users. These teams don't remove individual pieces of content under Facebook's Coordinated Inauthentic Behavior (CIB) policies; instead, Facebook takes down the pages or accounts connected to a network engaging in deceptive behaviour. The new board will focus on decisions relating to individual pieces of content to start with, and will review the removal of accounts or pages once those object types are technically available for appeals.

The new board will add another layer of transparency and independent decision-making. Members of the oversight board are not Facebook employees and cannot be removed by Facebook, although they will be paid from the funds provided by the social media company. All decisions will be made public, and Facebook will respond publicly to them.

Industry experts said that though the move is a step in the right direction, some of the rules governing the board could be a matter of concern. For example, the board has been given 90 days to review a piece of content. Activists believe that 90 days is too long because, on digital platforms, questionable content can go viral within hours. Responding to Business Line’s query, Brent Harris, Director, Governance and Strategic Initiatives, Facebook, said, “The purpose of the board isn't necessarily to deal with rapid viral issues, but complex, challenging content issues that have wide-ranging impact. In exceptional circumstances, for example, when content could result in urgent real-world consequences, Facebook may send cases to the Board for an automatic and expedited review. The Board will review as quickly as possible.”

When asked if the oversight board will also look into misinformation through advertisements, especially from political parties, Harris said, “We are less than six months from the US presidential election, and the infrastructure for the Board to consider user appeals is being finalized over the coming months. We expect the Board to be fully operational this autumn, so we want to be realistic about the Board's capacity to make quick decisions on these issues or to hear a political ads case ahead of the election.”