Dear ChatGPT, am I okay?

Why young Britons are turning to AI for emotional support

Anusha Singh Thursday 7th August 2025 04:41 EDT

As AI tools like ChatGPT become fixtures in everyday life, a subtle but profound transformation is underway—particularly among younger users in the UK. What began as a convenient tool for answering questions or generating content is evolving into something far more intimate: a trusted confidant, an emotional support system, and in many cases, a daily companion.

Even OpenAI CEO Sam Altman has voiced concerns about this growing emotional dependency. Recalling a conversation with a young user who said, "I can’t make any decision in my life without telling ChatGPT everything that’s going on," Altman described the shift as "really bad and dangerous." While AI might offer articulate, even therapeutic responses, he warns that replacing human counsel with chatbot comfort comes with risks—including weakened self-reflection and mental resilience.

Between trust and vulnerability

One of the most urgent concerns is the illusion of privacy. Unlike conversations with a GP or therapist, chats with AI lack legal confidentiality. “The fact that there are no protections is just screwed up,” Altman admitted. Many users, unaware of this, routinely share deeply personal details, mistakenly believing their words are private.

Dr Amita Kapoor, Founder and CTO of NePeur and a prominent AI researcher, sheds light on why these emotional bonds form so easily. “Human beings are exquisitely social, and we instinctively attribute mind and emotion to anything that speaks our language,” she explains. “Chatbots are fluent, always available and non-judgemental. They mimic the ideal listener, so our brains supply the rest.”

Features like adaptive tone, contextual memory, and simulated empathy deepen the illusion of emotional reciprocity. But Kapoor warns that this very intimacy often prompts users to overshare. “Unless a provider offers end-to-end encryption, short retention periods, and a clear ban on secondary use, every word typed should be considered potentially public,” she cautions. “If something must stay confidential, it should never be digitised at all.”

The data backs up the trend: over 70% of UK teenagers have used AI companions, and nearly half use them regularly. A third say their interactions with chatbots are as emotionally fulfilling as, or more fulfilling than, those with human friends. From writing breakup messages to discussing trauma or making major life decisions, AI is increasingly becoming the go-to confidant.

A double-edged solution for South Asian communities

For British South Asians, the trend has both promising and concerning implications. In communities where mental health remains stigmatised or misunderstood, AI offers a safe, anonymous space to share thoughts and emotions without judgment.

The appeal is easy to understand: NHS therapy wait times stretch for months, and culturally competent therapists are in short supply. Language barriers, generational gaps, and entrenched stigma further isolate many South Asians seeking help. AI companions step into that gap, offering a sense of privacy and emotional safety.

But this accessibility comes at a cost. Kapoor encourages users to set strict personal boundaries. “Avoid sharing identifiers, medical, financial, or intimate details. Take regular ‘analogue breaks’ to prevent digital dependence. And most importantly, remember: AI is a mirror, not a substitute.”

Sania Bibi, Co-Founder of I-Diagnose, believes the pandemic accelerated this shift. “During COVID-19, we saw how quickly digital tools were adopted to solve real-world problems,” she says. “That normalised deeper emotional integration with technology.”

Yet, Bibi warns, this growing reliance also sparks fears about ethical boundaries, job displacement, and long-term trust. “Transparency is critical,” she insists. “Governments and developers must demystify AI and empower people through education and reskilling.” She also advocates for mandatory AI literacy in schools: “This isn’t just about jobs. It’s about understanding the tools shaping our emotional and cognitive lives.”

As AI companions grow more sophisticated, complete with custom names, personalities, and even voices, the key question is no longer whether we can build emotionally intelligent machines, but whether they should replace real human relationships.

We may feel more connected than ever, but at what cost? There’s a growing paradox: increased digital intimacy may lead to greater real-world isolation. The path forward, then, lies in balance. AI can help break emotional silences, foster reflection, and serve as a prelude to therapy. But it cannot, and must not, replace the messy, nuanced, and essential experience of human connection.

