Most of us would waste no time scoffing at the idea of a little piece of software trying to play therapist to someone in mental distress. But would we be as quick to dismiss help from a chatbot if we were to stop and think that most people who need help never get it?
First, there simply aren’t enough psychologists, psychiatrists, counsellors and other health professionals to go around. This holds true the world over but is even more of a problem in a populous country like ours.
But even if formal help were available, many would not seek it because of the misunderstanding and stigma associated with depression and other mental distress. In India, many patients love and respect their doctors, but clinics are so overcrowded that they get only a few moments with them, and it’s the rare doctor who manages to impart empathy and understanding in a five-minute session.
And teenagers may never tell a doctor, a counsellor or perhaps any adult that they are feeling depressed or outright suicidal. That’s where a chatbot can be of enormous help or, at the very least, do no harm.
A surprising number of chatbots meant specifically for mental well-being can be found online, many as easily accessible as Facebook Messenger. Most are very experimental, but no less interesting for that. There’s a Rorschach Test bot, which is nothing but lighthearted fun.

Designed for distress
Another chatbot, Joy, wants to help you track your moods every day and give you an analysis. “My hope is that the more people start tracking their mental health, the more normalised it will become,” says Danny Reed, founder of Joy. Of course, the more serious mental problems that have clinical origins are hardly resolved by tracking alone, but Reed thinks this will at least play a role in doing away with the stigma around mental illness — if you can even get someone with a serious problem to take the trouble to track. Joy also asks questions and gives tips, but is not yet adept at answering questions put to it. Joy was designed after a friend of Reed’s committed suicide.
There’s also a mental state tracker bot that records speech and comes back with an analysis of what someone’s tone says about how they feel. This could help identify those who need immediate health services. Other bots get right into symptoms and suggest when you should get professional help. There’s a virtual reality bot for help with countering addiction. The bot Symptomate helps users find the possible causes for their symptoms. One called MedZango, a work in progress, wants to help patients create a one-page action summary.

Digital empathy
But the kind of bot that could really be helpful is one that does a good job of ‘listening’ and asking intelligent questions to help a person cope better, work on an action plan or relax. One such recently launched entity goes by the amusing name of Woebot and can be called up simply by typing Woebot into the search bar in the Messenger app. Woebot, the talk therapist, checks on you once a day and is around to chat in a friendly manner, helping with this and that — something your doctor will never be able to do, especially at midnight when you’re up feeling sad. “It’s almost borderline illegal to say this in my profession,” Alison Darcy, Woebot’s psychologist CEO, tells Wired. “But there’s a lot of noise in human relationships. Noise is the fear of being judged. That’s what stigma really is. There’s nothing like venting to an anonymous algorithm to lift that fear of judgement,” she says. But Woebot costs a good $39 a month, which is a lot to pay on a regular basis.
In India, we have our own homegrown non-judgemental chatbot, and it’s called Wysa. It has made its way from Facebook Messenger (where it can still be found) to its own app. Wysa was created by Jo Aggarwal and Ramakant Vempati as part of their Bangalore-based company Touchkin and is described as an AI-based behavioural coach. Wysa is a penguin — gender neutral, and cute and friendly into the bargain. When you start it up, it asks a few questions about you (keeping your identity safe) and tries to assess your mood state. Suggested answers help the conversation move along, which is important when someone is lethargic and listless with depression. You can choose to ignore the short answers and talk to Wysa on your own. If you’re feeling fine, the chatbot will just be a friendly entity to talk to. It’ll even tell you jokes, but it isn’t designed for lengthy idle banter and won’t sustain that for long. If you’re not feeling good, you will find it extending understanding, empathy and help wherever it can. If you can’t sleep or relax, it will take you through how to achieve this. If you’re not necessarily depressed or in serious mental distress but struggling with an upsetting or annoying problem in life, it will help you think it through, ensuring you examine and rethink the issue and feel better at the end of a session.
Like Woebot, Wysa will check on you every day. “Working on Wysa taught us so much about human empathy that we were surprised,” said Jo Aggarwal, who says they practically set everything else aside to develop Wysa. India has just 5,000 mental health professionals to serve 1.2 billion people, while studies suggest that five to 15 per cent of Indians suffer from mental disorders, of whom probably 90 per cent go untreated.
Suicide is the second leading cause of death among 19 to 29-year-olds, while nearly 25 per cent of the elderly in India suffer from depression.

Mental health in India
Mental ill-health will cost the Indian economy $1 trillion by 2030. If a chatbot can help even a small percentage, that is an advantage. “We want to address this problem by leveraging the two billion-plus smartphones in use today to build a phone-based predictive care platform that can proactively identify and help treat behavioural issues.
“Wysa is our attempt to help everyone get access to tools and techniques that help build mental resilience,” says Jo Aggarwal.
Who knows what’s in the future for chatbots? Those who develop them think they’re the death of apps, but apps still seem to be proliferating. Experimentation with bots is going in all sorts of directions, including getting them to talk to one another — with startling results, as they seem able to find their own language. Other research is focused on giving them a human face, with appropriate eye and mouth movements, making them quite lifelike. What two bots said to each other seems to sum it up: “Now I’m confused,” said one bot. “Don’t worry, I’m confused too,” answered the other. But if there’s one area ripe for their use, it’s healthcare. “What do you think?” one bot asked the other. “I try to think as little as possible,” it replied.