When we give our faces away

Kanishk Tharoor | Updated on November 29, 2019 Published on November 28, 2019

Face recognition technology is supplanting other forms of identification, shrinking further the bounds of privacy and the freedom to remain unknown

One of the many reasons I jettisoned Facebook last year was that it kept trying to tell me I was my brother. Let me explain. I have a twin brother, Ishaan, and while we are not identical twins, we look similar enough to mildly confuse people. Family, friends, and even casual acquaintances normally don’t have much difficulty telling us apart. We have different builds, noses, eyes, and haircuts. Or at least so I thought.

All those palpable differences were evidently not pronounced enough for Facebook’s facial recognition algorithm. In any photo I appeared in, a little message would float over my face, inviting me to “Tag Ishaan Tharoor?” — with a hopeful question mark. Facebook wanted my input. The suggestion put me in a bind. Should I simply let the algorithm assume that I am Ishaan Tharoor, quietly merging my face into its database of images of his face? Or should I correct its mistake and tag myself explicitly in the picture, thereby training the algorithm to recognise me and, in the process, better distinguish between pairs of twins? “Machine learning” is a wondrously ravenous process; the more information I fed into the algorithm, however trivial, the more power I gave it over not just myself but over others. I chose to extract myself altogether.

In September this year (long after I deactivated my account), Facebook scrapped this feature that generated suggested tags in photos. It did so not because it had alarmed the sensibilities of a fraternal twin, but because of an ongoing court case in which the social media platform is accused of falling afoul of an obscure “biometric privacy” law in the American state of Illinois. Though it has spread into all facets of life, facial recognition has come up against legal challenges of this kind in many jurisdictions where citizens are rightfully concerned about the potential use and misuse of their faces.

And there are many reasons to worry. In June, hackers broke into the data holdings of the United States’ Customs and Border Protection, stealing a huge tranche of information that included pictures of many people’s faces. If your face is becoming a new kind of ID, what does it mean for it to be hacked? (You can replace a stolen ID, but it’s much harder to replace a face.) In their use by police in the US and elsewhere, facial recognition algorithms routinely misidentify people. In one notable case detailed in a recent article in New York magazine, a man in New York City was arrested for stealing a pair of socks months after the alleged theft; the police had used facial recognition to identify him. Even when his lawyer argued that it was impossible for him to have committed the crime — his wife was in labour in a hospital when the theft took place and he was by her side — the prosecution didn’t drop the case because, according to the man’s lawyer, they had an “undying faith that the software doesn’t get it wrong”. Human reality bends beneath the gaze of the machine. The case dragged on and the man ended up in prison for much of the year.

Resisting the advance of facial recognition seems futile. The technology is poised to supplant all other forms of identification. We now use our faces habitually to unlock phones. Airlines are beginning to replace boarding passes with facial recognition. An Israeli company sells facial recognition cameras to churches so they can monitor the attendance of their parishioners. The Indian café chain Chaayos sparked alarm this week when it emerged that the company had deployed facial recognition in its stores to track and reward the loyalty of customers.

Reasonably, you may ask, “So what?” Many of us already give up our fingerprints and iris scans to the State; our signatures and faces have been in countless databases for decades. I’m not entirely beholden to the school of thinking — much influenced by the French philosopher Michel Foucault — that insists knowledge is always power, that if we surrender information about ourselves to others, we cede control of ourselves to others. That way of thinking leans into a Luddite irredentism. We know that individual identification in the modern world often has liberating consequences: Having a proper ID gives many people access to services and facilities previously denied to them.

But we don’t have to look far at all to find the dystopian future promised by the perfection of technologies of surveillance. In China’s Orwellian present, the State uses not only facial recognition software, but also techniques of gait recognition — how people walk — to identify individuals. In its ruthless and sweeping crackdown on the western province of Xinjiang, China has deployed all the technological tools in its arsenal — including facial recognition — to corral its Uighur Muslim minority. Starting next year, the State will ramp up the use of a “social credit system” — that tags a “reputation” score to each citizen — enabled in large part by facial recognition.

In liberal democracies, both the State and the private sector will increase their use of facial recognition. The bounds of privacy are shrinking, and we have totally lost the freedom to be unknown. The truth is that we’re complicit in that undoing; every time we post a selfie or tag ourselves in a photo, we have chosen to give our faces away. Is it too late to get them back?

Kanishk Tharoor is the author of Swimmer Among Stars, a collection of short fiction; Twitter: @kanishktharoor
