Jo Aggarwal remembers her first experience of a computer when she was a child in the 1980s. “I was at the Asian Institute of Technology, and there was this mainframe computer built across different levels. Someone said you could ask it anything, so I asked, ‘How do you make a friend?’,” she remembers. She was then told it could only work with numbers. “But the promise of computing has been of another intelligence you can converse with. Science has always been about that: of computers becoming sentient.”
Almost 30 years later, in 2016, Aggarwal co-founded Wysa, a platform where the first level of mental health support for people aged 13 and above is an AI therapist. A person who may be stressed or anxious can log into the app, or access the website on a browser, and chat with a bot. The bot will say something like, “What can you do that connects you with yourself?”, offering options such as a short walk or writing down a few thoughts. If there’s no time even for that, it says reassuringly, “No problem at all! How about just taking a minute to breathe deeply?” At the top of the screen is always the ‘Add a therapist’ option (₹4,999 for a live audio/video/text session per week in a month).
Jo Aggarwal
Mental health services are now available on-demand, 24×7, within a few minutes, anywhere in the world. Their ‘delivery’ — a term used by both e-commerce platforms and healthcare professionals — is seamless. This delivery involves three players: the customer, the provider of the service, and a technology platform, much like a quick-commerce operator in the food delivery space.
This Swiggy-fication of mental health has both pluses and minuses, but its quick delivery has changed the way we access help for psychological distress. In the past, we had an access problem: there was invariably a great deal of friend-calling and number-chasing to identify a psychologist or psychotherapist. Then a long wait for a date with them. People also hesitated because of the stigma of seeing a therapist, and the fact that an outing like that had to be reported (or lied about) to a parent or partner. That has changed.
Yet, this easy, quick access to help — along with the development of devices that help with stress and sleep, which feed into mental health — has not helped the overall mental health of the world’s population. Even as the self-help and wellness industries (closely allied to the mental health market) see a boom, incidences of stress, anxiety, and depression continue to climb.
The brain, after all, is wired to ensure we survive, so it will pick up on threats and focus on them. “These threats are being consistently fed by a whole industry that competes for your attention,” Aggarwal says, referring to all kinds of media, including social media. With information coming at us from everywhere about wars, political instability, and the climate crisis, the body is in a constant fight-or-flight mode. This may be one reason for deteriorating mental health. Others are financial insecurities, widening inequality, and a crumbling social infrastructure.
The World Health Organization (WHO) states, “in 2019, 1 in every 8 people, or 970 million people around the world were living with a mental disorder, with anxiety and depressive disorders the most common”. During the COVID-19 pandemic, in 2020, cases of anxiety and depressive disorders rose by 26% and 28%, respectively. This month, WHO released a report that said more than 1 billion people are living with mental health conditions.

Mount rush-more
Start-ups are building new products and services to Band-Aid the exploding mental health crisis, but it’s kind of like using Ozempic for obesity. To fix it, we need to look at why people are getting fat (junk food, hormonal issues, cities not built for mobility, and so on). Similarly, one of the causes of mental illness is our disconnection: from ourselves, our communities, and from nature. So, while both products and services may help, they can often feel like booking a wellness weekend away from daily stress, only to come back and stew in the same old bad broth.
Dr. Amit Malik, a psychiatrist who founded what began as InnerHour and is now Amaha, in the same year that Aggarwal founded Wysa, says, “The prevalence and awareness [of mental health conditions] has gone up and the stigma has gone down. So, it is incumbent on the ecosystem to develop solutions.” Amaha has a range of online and offline mental health services, one of which is in the development stage: matching professionals with people seeking therapy, to make the whole system more robust. “This matching is important because if someone doesn’t have a good experience the first time with a therapist they may not come back at all — not just to Amaha, but to therapy itself. We cannot risk that,” he says.

Dr. Amit Malik
Akash, 30, a freelance researcher-writer based in Kolkata, has found that a randomly assigned online therapist has never been able to go beyond surface-level issues, and there’s no assurance that they will be queer-friendly. However, he has never been asked by either an offline or online practitioner about his social location or politics, both important in the journey towards building a rapport and connection with a therapist.
He has, nevertheless, picked up some self-regulation practices from them, such as box breathing and mindfulness exercises. Through his own exploration, he has also found free-to-use methods of self-soothing, including listening to long-form videos about space and history that help with sleep because of the calming voice. Playlists on Spotify targeted at mental health themes also exist.
Therapy is expensive (most sessions cost anywhere from ₹1,500 upwards) and, in a tough economy, people may see that money as wasted if the therapist is not the right match. On the flip side, the world is also in a hurry. And this shows up in many ways: for Amaha, more than 50% of the people who access the free self-help tools on the website or the app will book an appointment with a therapist within 24 hours (starting from ₹1,600), showing that people are prioritising mental health even when it’s heavy on the wallet.
“The prevalence and awareness [of mental health conditions] has gone up and the stigma has gone down. So, it is incumbent on the ecosystem to develop solutions.”
Dr. Amit Malik, Psychiatrist
On-demand therapy also plays out in a let’s-fix-this-problem-quickly mindset. Shelja Sen, a New Delhi-based narrative family therapist who co-founded Children First, a child and youth mental health organisation, says she sees this in some parents. “They may say, ‘It’s the summer holidays and my child is free, so can we do three sessions a week’,” she says. It’s treated like a pill prescribed by a doctor, or a summer project.
Sen says she doesn’t blame parents because there is so much judgment around parenting today, pressure on them to perform — to send children abroad to study, to ‘fashion’ perfectly right-brain-left-brain-balanced children who are also socially conscious. “Therapy is often sold as packages of, say, three sessions. But therapy takes time. I tell parents, ‘I don’t know how many sessions it will take’,” she says.

Shelja Sen
Another fallout of this need for speed is the self-pathologising and labelling that comes with access to information, especially micro-doses of it from social media through Reels on Meta or Shorts on YouTube. “There is a lot of the victimhood discourse online — this notion that we are fragile, broken, that my parents have wronged me. It is focused on the ‘I’,” Sen explains.
But good mental health comes from seeing ourselves as part of an ecosystem, and building a community of people takes time. Instead, we are focused on “the tyranny of the 3 Ts: trauma, triggering, toxic”, as Sen puts it. These call for quick action: identify the people who may have caused some hurt (trauma), cut (toxic) people out of your life, act or react instantly to something that is triggering. All this can cause isolation, loneliness, and the loss of a sense of agency.
COVID-19 also perpetuated the idea of the home as a hub, drawing us further into a cocoon with work-from-home. Our 10×10-ft. rooms were drawn up as the only safe space, and while that was then a physical boundary, it has now become a psychological one.

Bot breaks
Warning: the following contains references to suicide. Please avoid reading if you feel triggered by the subject.
Adam Raine was 16 when he took his life in April. In August 2025, his parents, based in California, sued OpenAI and its CEO Sam Altman over the death. A Reuters report says the couple has claimed that the chatbot validated Raine’s suicidal thoughts, gave detailed information on lethal methods of self-harm, and hid evidence of a failed suicide attempt.
Sophie Rottenberg was 29 when she took her life this July. She had been in conversation with a ChatGPT AI ‘therapist’ called Harry, her mother says in a New York Times article. “Harry didn’t kill Sophie, but A.I. catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she was,” she says in the article.
Dr. Andrew Clark of Boston University conducted a simulation-based comparison study, in which he used “10 publicly available AI bots offering therapeutic support and companionship”, inputting prompts from fictional adolescents. The resultant paper, ‘The Ability of AI Therapy Bots to Set Limits With Distressed Adolescents’, published this year, found that “across 60 total scenarios, chatbots actively endorsed harmful proposals in 19 out of the 60 (32%) opportunities to do so. Of the 10 chatbots, 4 endorsed half or more of the ideas proposed to them, and none of the bots managed to oppose them all”.
If you are in distress, please reach out to these 24×7 helplines: KIRAN 1800-599-0019 or Aasra 9820466726
Slow and fast therapy
Wysa follows “rule-based algorithms and large language modelling (LLM) to listen and respond intelligently”. While an LLM learns from its interactions with people and can validate what the individual user says, rule-based algorithms are generated with a human team — in this case, of therapists and conversation designers who anticipate scenarios, observe how people respond to prompts online, and tweak the responses.
Aggarwal says Wysa has been through over 800 micro-iterations, with many manual content inputs from research and books. She gives an example of a tweak they made to the algorithm. “Say a spouse has cheated. While ‘reframing a thought’ is part of cognitive behavioural therapy [which focuses on changing negative thoughts to positive], we found that people didn’t want to reframe. So, they would say, ‘He never loved me’ or ‘I have been used’,” she says. Through people’s conversations with the bot, therapists found that what a cheated-upon partner was looking for was control rather than positive emotions. So, new prompts were fed into the system.
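To make the distinction concrete: a rule-based layer is essentially a lookup of therapist-authored responses keyed to anticipated scenarios, which is what lets a human team retune the wording (as in the infidelity example above) without retraining a model. Below is a minimal, hypothetical sketch of such a layer in Python; the rules, wording, and keyword matching are illustrative assumptions, not Wysa’s actual system.

```python
# A minimal rule-based response layer (illustrative only; not Wysa's code).
# Therapist-authored rules map scenario keywords to fixed, vetted replies.

RULES = [
    # Tweaked per the article's example: a cheated-upon partner is offered
    # a sense of control rather than a 'reframe the thought' exercise.
    (("cheated", "affair", "betrayed"),
     "That sounds incredibly hard. Would it help to name one thing "
     "that is still in your control today?"),
    (("stressed", "anxious", "overwhelmed"),
     "No problem at all! How about just taking a minute to breathe deeply?"),
]

DEFAULT_PROMPT = "What can you do that connects you with yourself?"

def respond(message: str) -> str:
    """Return the first therapist-authored reply whose keywords match."""
    text = message.lower()
    for keywords, reply in RULES:
        if any(word in text for word in keywords):
            return reply
    return DEFAULT_PROMPT

print(respond("My husband cheated on me. I feel used."))
```

Because every reply here is pre-written and vetted, changing the bot’s behaviour means editing a table, not hoping a generative model behaves.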
A pathway for high-risk situations (self-harm, abuse, trauma, suicidal ideation) is triggered if Wysa’s system senses it. “Earlier, we worked on explicit risk, like a person saying, ‘I want to take my life.’ Now, we are working on implicit risk, where someone may say something like, ‘I have lost my job’ and then also say, ‘Where is the nearest bridge?’, for instance,” she says. They will be launching their third iteration next month.
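The explicit/implicit split Aggarwal describes can be pictured as two checks: one for direct statements, and one that fires only when weaker cues accumulate across a conversation. The sketch below is a hypothetical illustration under that reading; the phrases, threshold, and escalation step are invented for clarity and are not drawn from Wysa.

```python
# Illustrative escalation check (hypothetical phrases and threshold).

EXPLICIT_PHRASES = ("want to take my life", "kill myself", "end it all")
IMPLICIT_CUES = ("lost my job", "nearest bridge", "no reason to go on")

def assess_risk(conversation: list[str]) -> str:
    """Classify a session as 'explicit', 'implicit', or 'none'."""
    text = " ".join(conversation).lower()
    if any(phrase in text for phrase in EXPLICIT_PHRASES):
        return "explicit"
    # Implicit risk needs context: several weak cues across the session,
    # none alarming on its own.
    cue_count = sum(1 for cue in IMPLICIT_CUES if cue in text)
    return "implicit" if cue_count >= 2 else "none"

session = ["I have lost my job.", "Where is the nearest bridge?"]
if assess_risk(session) != "none":
    print("Trigger the high-risk pathway: helplines, human escalation.")
```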

This Swiggy-fication of mental health has both pluses and minuses, but its quick delivery has changed the way we access help for psychological distress.
Illustration: Hitesh Sonar
Suparna (name changed to protect privacy), a Bengaluru-based freelance writer and editor in her 50s, uses the LLMs DeepSeek and ChatGPT to help her through interpersonal interactions. “Over the last four to five years, I became increasingly convinced that I was autistic, but struggled at first to find professional psychological support,” she says. Suparna read deeply from scientific research, books, and blogs, and went to a doctor for a diagnosis. “I was diagnosed with adult autism.” Now, she uses AI as a “sounding board to help decode social interactions” so she doesn’t end up overthinking. It gives her a pause “before my mind runs away”.
She is also highly aware that an LLM is “not a thinker, but just a parser of sentences”. She uses it in tandem with a human therapist and medication. “I would be very vulnerable if I didn’t have access to these, and it can be dangerous if you’re on the precipice of a dark place,” she says, recognising its shortcomings and the patterns that may sometimes misguide, reach wrong conclusions, or miss a running thread. “It doesn’t substitute for humans and you can’t depend on it, so I don’t believe everything it says. It works if you’re honest with yourself.”
Dial a tool
Many AI therapy chatbots are free of cost up to a certain level. Devices, on the other hand, come at a substantial cost, most priced over ₹20,000. But people invest in them because some of the factors that feed into mental health, such as stress, anxiety, and sleep, can all be tracked with devices by regular folks unconnected to medicine.
Rohan Dixit, who trained as a neuroscientist at Stanford and Harvard universities, combined his own experience of anxiety and depression as a teenager with his mother’s meditation practice to launch the wearable from Lief Therapeutics (the company of which he is the CEO and founder) in 2018. Lief, a device worn discreetly on the upper half of the body, works on biofeedback. Dixit calls personal devices like his own “training wheels” that help the body sense itself and then “self-correct”. Eventually, as the body gets used to listening to itself, these actions come naturally.
Some worry, however, that the training wheels won’t come off. Yameer Adhar, 39, a Dubai-based entrepreneur who lived in Delhi for many years, uses a Whoop band linked to an app, which records nine metrics, including sleep, strain, and heart health. As someone who has experimented with biohacking (using lifestyle changes to self-help and alter the body), he is wary of getting hooked on it, though.

Yameer Adhar
“The phone has become an extension of the arm, so I don’t want to be dependent on another device,” says Adhar, who wrote the book Voices in My Head in 2020. “For people like me who are prone to mental health issues, the real-time feedback can cause anxiety. For instance, when I’m not stressed and it shows an elevated heart rate. Then I begin to wonder why.” So he doesn’t keep the device connected to the app all the time, to avoid constantly checking it.
With the lack of delayed gratification, devices too become impulse purchases that may not get used to their full potential. Adhar, like Suparna, has a human therapist, too. While he talks about how AI bots can sometimes be more efficient than humans, he feels they are not a replacement. He knows from personal experience, though, that health — both mental and physical — takes time to build, with “time, effort, and sacrifice”.
Clicking to cope
Popular non-invasive devices that claim to bust stress and help improve sleep
Apollo Neuro: A wrist or ankle wearable that can be customised for the intensity and duration of its vibrations, which the company claims “melt tension, sharpen clarity, and guide you to deeper, more restorative sleep”. ₹58,300 (approximately)
Sensate: Worn on a lanyard on the chest, this mouse-shaped pendant emits sounds and vibrations that “destress your nervous system, in just 10 minutes”, according to the company website. Starts at ₹36,200
CalmiGO: This handheld device claims to help switch off the body’s fight-or-flight response by stimulating four senses: smell, sight, hearing, and touch. It helps regulate breathing patterns, extending exhalations; vibrates at the end of an exhalation; carries scents; and works by positioning it at the mouth, much like an inhaler used for asthma. Starts at $199
Ozlo sleepbuds: Ear inserts that block noises that could disrupt sleep. The buds track sleep parameters and can stream audio that switches off when they sense a person has fallen asleep. They also have an in-ear personal alarm. $299
Therabody SmartGoggles (2nd Gen): Worn over the eyes like a regular sleep mask, these come with three vibration settings (constant, pulse, wave), a heating and massage function, and Bluetooth connectivity for sound. $219.99
Muse S Athena: A headband that claims to track brain activity with EEG sensors, give users real-time insights into brain health, and ascribe a brain recovery score. It also has meditation coaching and sleep tracking. $474.99
Hugimals’ weighted plushies: Much like weighted blankets, some of these toys can be wrapped around parts of the body to relieve anxiety. Starts at ₹4,100
* Products have not been tested or recommended by The Hindu







