Facebook’s Oversight Board can become a module for self-regulation: Sudhir Krishnaswamy

Hemai Sheth Updated - May 08, 2020 at 02:22 PM.

Sudhir Krishnaswamy, vice-chancellor, National Law School of India University

Facebook on Wednesday announced the setting up of its Oversight Board with 20 members from across the globe.

The board comprises members from all over the world who have expertise in freedom of expression, digital rights, religious freedom, content moderation, online safety, internet censorship, platform transparency, and civil rights.

Sudhir Krishnaswamy, vice-chancellor of the National Law School of India University, is the only Indian among the first 20 people to be appointed to this board.

In an interview with BusinessLine, Krishnaswamy discusses the ultimate goal for the board, its strategy to navigate the grey areas in content moderation, and the future of this content moderation model. Edited excerpts:

The board will review complaints on a case-by-case basis. How will these cases be prioritised?

The board as a whole will develop a policy to determine which cases come to it. Cases can arrive through three channels: they could be referred by Facebook, referred by users, or the board might adopt a policy of taking up cases on its own. That is yet to be decided, but all the channels are open. The board will put out a policy on how cases will be selected and how they will be decided. The rest will continue to be handled by the current content policy administration; only a small set of issues related to user complaints will come to the board.

Currently, the board’s task is to review cases that are only related to the content that is taken down. What about issues related to potentially harmful content that is up on the platform?

Right now, the board is only reviewing cases related to content that has been taken down; that limitation is due to a technology issue and will be sorted out quickly. Once it is resolved, all decisions, whether about taking user content down or keeping it up, will be appealable.

According to the bylaws, the purpose of the Oversight Board is to protect freedom of expression. There has been broader criticism calling for the board's mandate to cover a wider range of human rights. Do you think the board should broaden its scope?

I don’t see it like that. The board will review issues arising out of content. So if it is a content-related issue, it is potentially appealable to the board. The reason would be either that you wanted the content to stay up or to come down. The basis or grounds of your complaint might be something else, but ultimately someone will want the content either to stay up or to be taken down. All issues related to content will come under the board.

Can the board refuse to review a case? If yes, will that decision be made public?

We will set up a policy through which we identify which cases are challenging, and only those will be taken up by the board. There may be some cases that users appeal which we find are adequately covered by the company's current interpretation of its policies. Once the board starts functioning, all decisions will be transparent and deliberated, and they will be made public.

In certain cases, content that may seem offensive to a particular entity might not be offensive to others. How will the board navigate this subjectivity?

The board will consider difficult cases. It will have guidance in the form of content policy and international norms, which will be taken into consideration. The board will work in panels, and the panel discussions will then go up to the full board. These will be deliberated, collective decisions.

So if a case comes to us, the background policy and the broader norms that apply to it will be considered. The best you can do in any decision-making process is to be articulate and make a collective decision based on the norms and the information that you have.

The board will have to make a decision in 90 days. Isn’t that time period quite long in today’s digital world? Do you think this time period could be shorter?

Every day, there are thousands of decisions being made on the platform about content that can stay up or be taken down. All of that is going to continue. The cases that come to the board will have some novelty or raise difficult questions that have not been addressed before, so the board will take some more time over those decisions. Each decision has an effect on the future consideration of cases of that sort; that is the benefit. Ultimately, we could make decisions really fast, but the motivation here is to slow down a bit and make deliberate decisions that we can work with over a longer period of time.

There are exceptions to information that the board can access. For instance, information such as direct messages on Messenger, Instagram and information from Oculus is not accessible. Do you think this will hinder the review process in any way?

We will not have access to some information, some of it due to technical issues and some due to privacy reasons. There will be more than enough information to make a sensible decision. Access to information is not a concern at the moment.

The board is compensated through a trust fund set up by Facebook. Won’t there be a conflict of interest?

If you look around, there are many such entities: independent foundations that operate on their own, at a distance from the company. In that sense, the structure of the board is independent. We are neither governed by Facebook, nor are we responsible to Facebook, nor can Facebook interfere in our decisions. Facebook does not engage with us.

The board can make independent decisions on the specific cases it reviews. However, currently, it is up to Facebook to decide whether or not these decisions will be implemented at a broader, policy level. How can the board navigate this grey area?

The board will make independent decisions in cases. But it will also publish an annual report in which we review all the decisions and their implementation, and that will be a public document. So, Facebook will be continuously accountable to the board. Over a period of time, we will be given the task of policy decisions as well, though that is not our primary task.

Will the process of appealing to the board be separate for individual users and institutional complaints?

None of that has been decided. All of that will be spelt out. The board has not been able to meet, given the circumstances. We anticipate that by September we should be at work, with all these documents prepared and the groundwork done. The documents have not yet been drawn up; preparing them will be the board's first job.

What is the way ahead for the Oversight Board and this method of content moderation as a whole?

This problem of how to go about content moderation has been around for 10-15 years. If this works, it might become a module for other platforms to implement self-regulation, and that is an exciting possibility. Currently, this applies to Facebook and Instagram. If this model works, it can be adopted by other platforms and entities facing similar issues.

Published on May 8, 2020 08:51