Columns

Monitoring digital content

G Krishna Kumar | Updated on April 13, 2021

The mechanism must be transparent and unbiased

The government’s Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021 sent shockwaves across the digital and OTT industry. And why not? This is the first time the government has undertaken any initiative towards regulating the hitherto unregulated digital media and OTT (over the top) platforms. Regulation and restriction are separated by a fine line, and the government has therefore repeatedly clarified that it is aiming for “soft touch” regulation.

India’s Internet usage has been growing rapidly: the user base has more than doubled since 2015 to over 70 crore, and is expected to touch 100 crore by 2025. During the past three years, subscribers on digital and OTT platforms have also grown rapidly.

Two-sided marketplace

Digital platforms (like Facebook, YouTube) and OTT players (like Amazon Prime, Netflix) are often called intermediaries in a two-sided marketplace. In such a market, two sets of players interact through the intermediary or platform. In the case of, say, Netflix, the two sides would be the content creator (a movie or documentary producer) on one side and the consumer who watches the content on the other.

“Network effect” plays a major role in a two-sided marketplace. Essentially, the more subscribers a platform has, the more attractive it is to content providers. Content moderation guidelines therefore need to balance the needs of the general public and the content providers.
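As an illustration of the network effect described above (this toy model is my own, not part of the Rules or the article), Metcalfe's law is one common approximation: a platform's value grows roughly with the square of its user count, which is why each new subscriber makes the platform disproportionately more attractive to content providers.

```python
def platform_value(subscribers: int, value_per_link: float = 1.0) -> float:
    """Approximate platform value as proportional to the number of
    possible subscriber pairs (Metcalfe-style): n * (n - 1) / 2."""
    return value_per_link * subscribers * (subscribers - 1) / 2

# Doubling the user base roughly quadruples the modelled value,
# illustrating why scale matters so much in a two-sided marketplace.
small = platform_value(1_000)   # 499,500 possible pairs
large = platform_value(2_000)   # 1,999,000 possible pairs
print(round(large / small, 2))  # close to 4
```

The quadratic form is only one of several proposed scaling laws; the point is simply that value grows faster than linearly in the subscriber count.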

Social media companies have, of late, been facing a trust deficit as issues related to data breaches, privacy, provocative posts, fake news, etc., have been reported across several countries. India’s ban on Chinese apps over data breaches, and the subsequent surge in equivalent ‘Made in India’ apps/platforms, is well known. This should serve as a warning to the global tech giants about India’s willingness to act in case of non-compliance.

EU regulators are pushing for laws that would hold the intermediary companies directly responsible for dissemination of illegal content on their platforms.

The UK is seeking to hold the intermediary companies responsible for a predefined list of online harms including illegal content and harmful user behaviours. France requires companies to remove illegal content within 24 hours from receiving a notification.

Singapore’s digital content regulation by Infocomm Media Development Authority (IMDA) focusses on community standards while providing more choices for adults and protecting the young. IMDA believes in co-regulation as an effective mechanism.

Recently, the Australian government started an inquiry into the role of global technology firms/platforms in spreading false information. Global tech firms have already responded by launching a voluntary code to prevent the spread of false information on their platforms.

India’s plan to trace the source or origin of harmful content is a good step, as it will deter mischievous elements from spreading false or harmful content on social media platforms. It will also push content providers and OTT companies to abide by the guidelines.

However, considering the size of the digital user base in the country, the government must create the right framework to understand the challenges in implementation. Can anyone raise an objection, and how will the system cope if there are thousands of complaints? The online platforms should provide clear information on their operational model and responsibilities.

India’s plan to establish a three-level grievance redress mechanism looks to be a good model. The grievance redress officer needs to acknowledge complaints within 24 hours and resolve them within 15 days. The government has defined a threshold of 50 lakh registered subscribers for an intermediary to be considered “significant”; such intermediaries must meet additional compliance and reporting requirements. Overall, it is still not clear how the whole model will be implemented.
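The compliance thresholds above can be sketched as simple checks. This is a minimal illustration under my own naming (the Rules do not, of course, prescribe any code): acknowledge within 24 hours, resolve within 15 days, and treat an intermediary with 50 lakh (5 million) or more registered users as “significant”.

```python
from datetime import datetime, timedelta

SIGNIFICANT_THRESHOLD = 5_000_000      # 50 lakh registered subscribers
ACK_DEADLINE = timedelta(hours=24)     # acknowledge complaints within 24 hours
RESOLVE_DEADLINE = timedelta(days=15)  # resolve complaints within 15 days

def is_significant(registered_subscribers: int) -> bool:
    """Check whether an intermediary crosses the 'significant' threshold."""
    return registered_subscribers >= SIGNIFICANT_THRESHOLD

def sla_breaches(received: datetime,
                 acknowledged: datetime,
                 resolved: datetime) -> list:
    """Return which of the two timelines a single complaint breached."""
    breaches = []
    if acknowledged - received > ACK_DEADLINE:
        breaches.append("acknowledgement")
    if resolved - received > RESOLVE_DEADLINE:
        breaches.append("resolution")
    return breaches

# Example: a complaint acknowledged after 30 hours but resolved in 10 days
# breaches only the acknowledgement timeline.
t0 = datetime(2021, 4, 1, 9, 0)
print(is_significant(7_000_000))
print(sla_breaches(t0, t0 + timedelta(hours=30), t0 + timedelta(days=10)))
```

Even this trivial model hints at the scaling question raised earlier: with thousands of complaints, tracking per-complaint deadlines becomes an operational system of its own.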

The Information and Broadcasting Ministry will formulate an oversight mechanism. The government holding all the powers can be tricky; much depends on the maturity of the overall ecosystem, including the government itself, in creating an unbiased complaint redress system. While the government implements content moderation, issues like consumer/data protection and consumers’ privacy must not be diluted.

Considering the complexities, it would take at least a year for the impact of the present regulations to become visible. With several stakeholders involved in the process, regular audits and reporting will help strengthen the regulations and bring a practical, unbiased and transparent mechanism to the country.

The writer is an ICT professional and columnist based in Bengaluru. Views are personal

