Govt cracks down on ‘Blue Whale’ game

Updated - January 09, 2018 at 06:38 PM.

Google, Facebook, WhatsApp asked to remove links of such games from their platforms

In an era when information, messaging apps or games can become a global rage within seconds, especially among the burgeoning number of young internet users in countries such as India, the government has taken serious note of the ‘Blue Whale Challenge’, a “suicide” game that has reportedly led many youngsters to kill themselves, with a few cases reported in India, too.

The Blue Whale Challenge is said to be a “suicide game” in which the player is given certain tasks to complete within 50 days.

The final task leads to suicide, and a player is even asked to share photos after finishing the challenge.

Directive to tech firms

Concerned over the issue, the Ministry of Electronics and Information Technology (MeitY) has written to major technology firms, including Facebook India, Google India, Instagram, Microsoft India, WhatsApp and Yahoo! India, to remove links of such games from their platforms.

“You are hereby required to ensure that any such link of this deadly game in its own name or any similar game is immediately removed from your platform. The proponent of this game should be reported to the law enforcement agencies,” the Ministry said in the letter.

“It is understood that an administrator of the game uses social media platform to invite/ incite children to play this game, which may eventually lead the child to take extreme steps for self-inflicting injuries, including suicide,” the letter added.

However, the question that arises is how these app developers or platforms can control or protect children from harming themselves.

When asked, some companies said they do not allow the promotion of such self-injurious or suicidal games and remove them as and when they are reported. “We care about the safety of our community and want to provide assistance for people in distress. As outlined in our Community Standards, we don’t allow the promotion of self-injury or suicide and will remove it when reported to us,” said a Facebook spokesperson.

The spokesperson also said the social media platform offers a number of support options and resources to people who have expressed suicidal thoughts, as well as to people who want to reach out to a friend who may be struggling.

“These tools and resources were developed with the help of over 70 mental health partners around the world, including several in India, and we’re continuously improving them to build a safer and more supportive community on Facebook,” the spokesperson added.

Google’s guidelines

Google has similar guidelines: under its Play Store policies, it does not allow apps that depict or facilitate gratuitous violence or other dangerous activities. This includes apps with instructions on how to commit suicide or those that promote self-harm.

When asked, a Microsoft India spokesperson said: “We are looking into MeitY’s request as a matter of priority.”

Published on August 15, 2017 17:13