Can AI offer the comfort of a therapist?
ETimes | June 16, 2025 7:39 AM CST

One evening, feeling overwhelmed, 24-year-old Delhi resident Nisha Popli typed, “You’re my psychiatrist now,” into ChatGPT. Since then, she has relied on the AI tool to process her thoughts and seek mental health support. “I started using it in late 2024, especially after I paused therapy due to costs. It’s been a steady support for six months now,” says Popli.
Similarly, a 30-year-old Mumbai lawyer, who uses ChatGPT for various tasks like checking recipes and drafting emails, turned to it for emotional support. “The insights and help were surprisingly valuable. I chose ChatGPT because it’s already a part of my routine.”
With AI tools and apps available 24/7, many are turning to them for emotional support.
“People are increasingly turning to AI tools for mental health support, tackling everything from general issues like dating and parenting to more specific concerns, such as sharing symptoms and seeking diagnoses,” says Dr Arti Shroff, a clinical psychologist.
But what drives individuals to explore AI-generated solutions for mental health?

WHY USERS ARE TURNING TO AI

Therapy is expensive
“As someone who values independence, I found therapy financially difficult to sustain,” shares Popli, adding, “That’s when I turned to ChatGPT. I needed a safe, judgment-free space to talk, vent, and process my thoughts. Surprisingly, this AI offered just that — with warmth, logic, and empathy. It felt like a quiet hand to hold.”

People feel shy about in-person visits
Dr Santosh Bangar, senior consultant psychiatrist, says, “Many people often feel shy or hesitant about seeking in-person therapy. As a result, they turn to AI tools to express their feelings and sorrows, finding it easier to open up to chatbots. These tools are also useful in situations where accessing traditional therapy is difficult.”

Nobody to talk to
Kolkata-based Hena Ahmed, a user of the mental health app Headspace, says she started using it after experiencing loneliness. “I've been using Headspace for about a month now. The AI tool in the app helps me with personalised suggestions on which mindfulness practices I should follow and which calming techniques can help me overcome my loneliness. I was feeling quite alone after undergoing surgery recently and extremely stressed while trying to manage everything. It was responsive and, to a certain extent, quite helpful,” she shares.

Users see changes in themselves

The Mumbai-based corporate lawyer says, “ChatGPT offers quick solutions and acts as a reliable sounding board for my concerns. I appreciate the voice feature for instant responses. It helps create mental health plans, provides scenarios, and suggests approaches for tackling challenges effectively.”
“My panic attacks have become rare, my overthinking has reduced, and emotionally, I feel more grounded. AI didn’t fix me, but it walked with me through tough days—and that’s healing in itself,” says Popli.

CAN AI REPLACE A THERAPIST?
Dr Shroff says, “AI cannot replace a therapist. Often, AI can lead to incorrect diagnoses since it lacks the ability to assess you in person. In-person interactions provide valuable non-verbal cues that help therapists understand a person’s personality and traits.”
Echoing this, Dr Bangar says, “AI can support mental health by offering helpful tools, but it shouldn’t replace a therapist. Chatbots can aid healing, but for serious issues like depression, anxiety, or panic attacks, professional guidance remains essential for safe and effective treatment.”

DO CHATBOTS EXPERIENCE STRESS?
Researchers found that AI chatbots like ChatGPT-4 can show signs of stress, or “state anxiety”, when responding to trauma-related prompts. Using a recognised psychological tool, they measured how emotionally charged language affects AI, raising ethical questions about its design, especially for use in mental health settings.
In another development, researchers at Dartmouth College are working to legitimise the use of AI in mental health care through Therabot, a chatbot designed to provide safe and reliable therapy. Early trials show positive results, with further studies planned to compare its performance with traditional therapy, highlighting AI’s growing potential to support mental wellbeing.

ARE USERS CONCERNED ABOUT DATA PRIVACY?
While some users haven’t checked whether the data they share during chats is secure, others approach it cautiously. Ahmed admits she hasn’t considered privacy: “I haven’t looked into the data security part, though. Moving forward, I’d like to check the terms and policies related to it.”
In contrast, Popli is more careful: “I don’t share sensitive identity data, and I’m cautious. I’d love to see more transparency in how AI tools safeguard emotional data.”
The Mumbai-based lawyer adds, “Aside from ChatGPT, we share data across other platforms. Our data is already prevalent online, whether through social media or email, so it doesn’t concern me significantly.”
Experts say most people aren’t fully aware of the security risks: there’s a gap between what users assume is private and what these tools actually do with their data. Pratim Mukherjee, senior director of engineering at McAfee, explains, “Many mental health AI apps collect more than what you type—they track patterns, tone, usage, and emotional responses. This data may not stay private. Depending on the terms, your chat history could help train future versions or be shared externally. These tools may feel personal, but they gather data.”

Tips for protecting privacy with AI tools/apps
- Understand the data the app collects and how it's used
- Look for a clear privacy policy, opt-out options, and data deletion features
- Avoid sharing location data or limit it to app usage only
- Read reviews, check the developer, and avoid apps with vague promises

What to watch for in mental health AI apps
- Lack of transparency in data collection, storage, or sharing practices
- Inability to delete your data
- Requests for unnecessary permissions
- Absence of independent security checks
- Lack of clear information on how sensitive mental health data is used
