In early August, Solapur in southeast Maharashtra was gripped by a strange fear. Like most small towns, Solapur rarely makes the headlines except when drought deepens. That changed as alarmed villagers in almost all of the district’s 11 tehsils camped outdoors day and night, on the lookout for an unseen enemy. “In the bastis (villages), residents kept night vigils, sitting around a fire,” Deepak Homkar, a local journalist, recalls. Rumours were flying thick — of theft, widespread looting and possible kidnapping of children. And all of it over WhatsApp, the instant messaging app. Similar scenes were reported from Ahmedabad a month earlier, where rumours of dacoity and terrorist attacks spread panic in surrounding areas. Those who had allegedly sent the fear-mongering texts were arrested, but the damage had already been done.
With over 800 million active users worldwide, and growing, WhatsApp is popular among India’s 160 million smartphone users too. Neatly slotting lives into groups of friends, work and family, it allows users to flit in and out of interactions. But a fair amount of trouble-making is also springing up from the app.
In Solapur, the police tasked with putting an end to the rumours visited bastis, narrowed down the suspected smartphone users and randomly checked their WhatsApp messages. “We found these rumours on some, not other [phones],” says a police official. After a 36-hour search, 16 young men were held under Section 505(1)(b) of the IPC for spreading alarm and fear in Pandharpur tehsil alone. They included those who allegedly sent the message and several ‘admins’ (those who open and manage the group accounts). After being questioned and warned against repeating such texts, the men were let off.
The complicity of the WhatsApp admins, whether as passive onlookers or as people who forwarded the messages themselves, remains hazy. “It is possible that some admins may have forwarded the text, but I spoke to at least one who was held only because he managed the group,” says Homkar.
Are the admins culpable in such situations? “Not at all,” say internet experts, if the admin has had nothing to do with the fear-mongering texts. But if the admin has forwarded a potentially harmful message, he/she is accountable like anyone else. “It is then an act done knowingly,” says Prasanth Sugathan, counsel at the Delhi-based Software Freedom Law Centre. “The act of forwarding makes you accountable. The burden of truth is on you,” he adds. Chinmayi Arun, research director at Centre for Communication Governance, National Law University, Delhi, pitches in, “Just being an inactive administrator of a large group, who may not be able to vet all of its content, is very different from forwarding rumours. People who forward rumours should be responsible enough to at least highlight their doubtful veracity.”
However, if the admin is unconnected to the group’s activities, he cannot be held for merely starting the group, they say. “Unless, of course, you have started it for an illegal activity or to cause an offence,” says Sunil Abraham, executive director of the Bengaluru-based Centre for Internet and Society. The WhatsApp admin, they point out, is a mere intermediary — one who isn’t vested with any power except to add or remove members from the group.
Twenty-three-year-old Hrishikesh, from Dhanbad, is currently an admin of six groups. In three, he is one among multiple admins. He hardly keeps tabs on the goings-on in this space and is not acquainted with all the members, he says. “A WhatsApp admin has no control, no facility to moderate or tweak a message,” says Sugathan. Abraham cites Section 79 of the IT Act. “It gives the admin immunity from liability that emerges from content posted by the members,” he says. The best way to track the original senders in such cases, he says, is to rope in the help of the telecom department and the other intermediary (in the case of WhatsApp, its owner Facebook), and blend it with some ‘old-fashioned’ detective work.
“Counter bad speech with good speech,” says Abraham — that is often the best way to deal with rumour-mongering. Instances like those in Solapur and Ahmedabad have been rare, he reasons. “Such stuff can be dealt with better through education rather than regulation. Not all types of nuisance should be regulated. The cost of implementing new laws and training police personnel for it is not cheap. In these cases, SMSes from the police could go to every single mobile user in the district, telling them the rumours are false.”
Sugathan concurs. Facebook, radio and other mass media should be used by the police to quell rumours, he says. He points out that in the aftermath of the 2011 London riots, although social media was blamed for aggravating the situation, there were ample warnings against shutting it down during such times. “Blocking the medium is blocking an avenue for information. One cannot arrest each and every person. So educating people works better,” says Sugathan. Some, like Abraham, consider these hiccups inevitable in our evolving use of social media. A new technology is often considered sacrosanct and reliable. “From repeated exposure emerges critical understanding. It will take us another five years to know that Wikipedia is not the source of truth.”