Moderation of content on social media platforms such as Facebook, Twitter, Instagram, YouTube and other Google products ought to be a core activity of the major companies themselves, not outsourced to third-party operators under abysmal, low-paid working conditions, says a report by the New York University Stern Center for Business and Human Rights.

The report, “Who Moderates the Social Media Giants? A Call to End Outsourcing”, highlighted that incendiary online content, which has often led to violence and conflict, especially in developing countries, persists because the critical function of content moderation has been outsourced to poorly paid, overworked, “second class” employees hired by third-party contractors.

According to the report, 15,000 workers, the overwhelming majority of them employed by third-party vendors, police Facebook’s main platform and its Instagram subsidiary. About 10,000 people scrutinise YouTube and other Google products. Twitter, a much smaller company, has about 1,500 moderators. The report said these numbers are inadequate relative to the overwhelming volume of content the moderators have to examine.

Employees vs AI systems

“These numbers may sound substantial, but they’re woefully inadequate,” said the report, highlighting the enormous volume these moderators have to assess: billions of posts and uploads appear on Facebook, Twitter, and YouTube every day. On Facebook alone, more than three million items are reported daily by Artificial Intelligence (AI) systems and users as potentially warranting removal.

The report said this volume is a direct result of Facebook’s aggressive business strategy of pursuing global user growth to please investors and advertisers, who are the company’s paying customers. However, unlike other global companies, Facebook has not matched that growth with adequate accountability. The report recommended that moderators be treated as full-time employees of the social media giants, since they perform a critical core function.

“Increased moderation needs to be accompanied by the presence of a country director and policy staff members in each country where Facebook operates. Responsible global companies have people on the ground where they do business. A social media platform should be no different. Facebook, YouTube, and Twitter should have offices in every country where users can access their sites,” it added.

The sensitive nature of content moderation was revealed during the Covid-19 pandemic, when thousands of outsourced moderators, or reviewers, were sent home to maintain social distancing, but Facebook, YouTube and Twitter did not want content review to take place remotely and unsupervised.

Content review and Covid

Consequently, all three social media companies announced in mid-March that they would temporarily reduce their reliance on human moderation and shift more of the content review burden to their AI-driven technology.

Facebook head Mark Zuckerberg conceded that the pandemic-induced reliance on technology would lead to mistakes: the company’s algorithms would inevitably “take down some content that was not supposed to be taken down”. The Stern report noted that Zuckerberg decided to enlist the company’s full-time employees to review “the most sensitive type of content”, and called this the way forward for content moderation. “…Human involvement should come from people who are full-time employees,” said the report.
