Why the New Force needs taming

R Seshasayee | Updated on January 18, 2021


Setting up a national and a global authority to fact-check and call out fake news on social media is the need of the hour

On January 6, Twitter struck the name of Donald Trump off its rolls. Trump, the President of the United States, albeit a defeated one. For bad behaviour.

Twitter’s action (although highly opportunistic) conveyed clearly to the world who has the power to call the shots now. It also triggered some ironic responses. Trump took to his Trump Team handle to tweet, unwittingly acknowledging the higher power of Twitter: ‘You can do without me, but I can’t do without you’.

The right wing in the US wailed about freedom of expression being throttled. In India, the ruling party was alarmed about the ‘abuse of power’!

Big Tech social media is the New Force, one that has the power to change the course of history.

It’s time to harness the New Force for the good of the people, and avoid the risk of it turning out to be a marauding monster. That requires an understanding of the nature of the beast.

Unfiltered, unverified

This New Force derives its extraordinary power from vox populi, the unfiltered, unverified and unrefined expressions of millions of people. When social platforms first hit the market, they were hailed as the ultimate in the democratisation of news. Except, as it turns out, public expressions are not necessarily true, or innocent. In fact, we might not even be listening to people’s voices, but to vicious algorithms. Jack Dorsey, Twitter’s co-founder, thought when the company was launched that the platform would be used by people ‘for sharing inconsequential personal news’. That error in judgment has made him a multi-billionaire.

The scale and power of social platforms is a direct consequence of allowing unedited postings by often anonymous people. If postings had to be checked for veracity or assessed for adverse social impact, social media could not have grown this big. Contrast it with the much smaller size of print or electronic media, where there is intermediation in content delivery. Equally, regulatory insistence on ‘KYC’ of members would have made social media ‘Dead on Arrival’.

It would be a complete blunder to place any constraint on the growth of social media or to change its basic character. Yet, it is this very unfiltered ‘news’ and anonymity that generate fake news and violent intent.

Secondly, digital media platforms have an identity problem. In the initial years, social media companies claimed that they were merely enablers of technology for people to interact, and that they were agnostic to content. As such, they could be no more responsible for offensive content passing through their platforms than a highway company is for a fatality caused by a rash driver.

Indeed, in the US, the law popularly referred to as Section 230 (of the Communications Decency Act) provides specific immunity to internet companies against offensive content carried by them. It also provides them sweeping powers to decide what they will carry and what they will not.

However, given the onslaught from law-makers across countries over the deluge of offensive and violent content, social media companies have since realised that playing the innocent bystander wouldn’t wash. Yet, they have been unwilling to commit to similar levels of accountability as the print and electronic media.

The task is therefore to reduce, if not eliminate, the toxic side-effects of fake news and violent intent and to establish accountability for responsible dissemination.

A case for accountability

Accountability, even in a diluted form, needs to come with the authority to regulate content. While social media took tentative steps to curate content, they faced accusations of bias. At a Congressional hearing in October 2020, Mark Zuckerberg summed it up: “Democrats often say that we don’t remove enough content, and Republicans often say we remove too much.” The fact that both sides are unhappy doesn’t mean that Facebook was getting it right.

Any attempt to establish accountability through regulation has to deftly straddle freedom of speech and social responsibility at two ends of the pole, while recognising the unique character of social media as neither a tollway nor a freeway. In India, the government recently brought up the question of regulating digital media in its response to a Public Interest Litigation (PIL) before the Supreme Court, which prayed for a direction to establish a regulator for the electronic media.

The government rightly told the apex court that more than mainstream electronic and print media, the digital media required regulation, given its reach and influence, particularly because the most powerful players were foreign companies, with their own national and business interests.

Despite massive challenges, a regulatory framework that ensures the untrammelled growth of media platforms while calling out untruth and offensive content is both necessary and feasible.

A global compact

Considering that digital media is disdainful of national borders, the exercise has to begin simultaneously with a national authority as well as with efforts to put together a global compact of democratic nations, that would agree on a limited agenda with just four goals:

The Authority would

Develop, adopt and deploy fact-checking algorithms to detect and call out fake news. While this can never be a comprehensive exercise, nor can it guarantee truth, it would serve as a major deterrent against ‘manufactured fake news’. This is not to deny the voluntary work already being carried out by private institutions.

Detect and eliminate commonly agreed offensive content such as child pornography or incitement to violence, consistent with the laws of the country/member nations.

Set standards for industry to follow, with regard to the use and deployment of the above technology tools and processes.

Prescribe and enforce appropriate sanctions against errant players.

The authority should be more like the last line of defence, with primary responsibility mandated on the media companies themselves.

While a regulatory body will ensure the transfer of authority from big business to a public office on matters pertaining to truth and public good, there is a risk of placing manipulative power in the hands of the government to shape public opinion in its favour. This concern is valid, but it would be no different from similar encroachments on other institutions of democracy.

The construct of democracy is not only made of periodic electoral choices, but also of independent institutions, such as Parliament, the Judiciary, the Media and the audit of public funds. Independent regulators are a sub-set of this institutional arrangement.

Ultimately, for a democracy to thrive, civil society also has to play its active role as sentinels to protect institutional independence and individual liberty.

The fear of the Big State cannot prevent taming of the Big Tech.


The writer is a Corporate Advisor

Published on January 18, 2021
