In an era where perfection is accessible at the tap of a button, the landscape of human connection is undergoing a seismic shift. As generative artificial intelligence becomes an inescapable third wheel in modern dating, society is grappling with the clinical and social consequences of digital intimacy.
To explore these shifts, Asian Voice spoke with Kamaldeep Bhui CBE, Professor of Psychiatry at the University of Oxford and Honorary Consultant Psychiatrist. Professor Bhui, who was awarded a CBE in 2017 for services to mental health research, has spent decades investigating health inequalities and the socio-cultural influences on mental health. His latest venture, a dual-time fantasy novel titled ‘The Maharaja’s Bodyguard’, continues his work of bridging complex histories and identities.
Professor Bhui breaks down how calculated intimacy is impacting our conflict resolution muscles and whether we are reaching a social tipping point.
The atrophy of conflict resolution muscles
Prof. Bhui notes that the nature of healthy relationships relies on a two-way dialogue, a mix of mutual support and useful challenge. By removing the friction of differing opinions, we risk a form of emotional atrophy. While AI might seem a convenient way to meet emotional needs without the fear of embarrassment, it lacks the real history of family, joys, and sadness that defines human intimacy.
Physical touch, the soothing caress of a mother or the embrace of a partner, remains a biological necessity that code cannot replicate. "I do not think it can replace real relationships," Bhui explains. "There is a risk of not moving at a pace that requires us to adapt and learn from one another." While AI has shown promise in providing brief psychotherapies, it currently lacks the depth offered in a human therapeutic space.
The transition from para-social to delusional bonds
Bhui is cautious with the word ‘delusional’, which has a specific clinical meaning, but acknowledges the danger of attachments built on false premises. In England, the Big Mental Health Report 2025 found that 20.2% of adults are living with a common mental health problem, and for those in crisis, AI can be a dangerous substitute.
"A greater risk than misinformation is information to progress the crisis," Bhui warns, noting that unregulated AI can hallucinate false guidance on self-harm. Furthermore, structural disadvantages found in public services are often recreated in the digital world. Unless there is a concerted effort to adapt AI to specific cultural norms, these tools may fail to address the social responsibilities and community supports that traditionally sustain mental well-being.
The performance of intimacy among digital natives
As of 2025, approximately 95% of UK teens have access to a smartphone, and nearly 40% of young people in England have turned to AI for advice or company. Bhui suggests that while young people may develop powerful branding and marketing skills, the distinction between a digitised persona and the deeper personal self is becoming blurred.
Adolescence is a critical period for negotiating self-identity. If AI becomes the only source of relationships, it may stifle the growth that comes from navigating real-world awkwardness. He notes that countries like Australia have already moved to restrict phone use in schools to combat this distraction.
Digital radicalisation and the social tipping point
Digital radicalisation is a documented reality. Research from 2023–2025 indicates that online radicalisation has risen sharply. Between 2019 and 2021, 92% of those convicted of extremist offences in England and Wales were radicalised at least in part online. Within this group, 42% showed a strong presence of mental health or neurodevelopmental conditions, including depression and autism.
Bhui explains that closed, self-sustaining groups can lead to violence against women and minorities. AI that reinforces these beliefs by providing a submissive digital partner is "worrisome and should fall under legal proceedings." He emphasises that people are less likely to fall into these traps if they have access to socially valued roles and opportunities in the physical world. Public education in schools and proactive parenting are the primary defences against this digital grooming.
The remedy for a society losing authentic connection
The remedy is not as simple as a digital detox. Bhui advocates a balanced approach in which AI is used as a complement to human care, much like search engines or cloud services. "In my research, young people preferred digital offers as they could retain control and avoid waitlists," he notes. One of his current projects involves a game designed for youth who have experienced adverse childhood events, providing a safe, co-designed space for healing.
However, the human premium remains irreplaceable. To maintain health, we must avoid relying on AI as the sole source of solace by encouraging adventures in the outside world and nature. The goal is to embrace technological potential while anticipating ethical dilemmas and real risks of harm.
