AI Chatbots and Sociology: Dangerous Friends or Hyperreal Companions?

Last Updated on October 14, 2025 by Karl Thompson

A recent article in The Week (5th September 2025), "AI Chatbots: Are They Dangerous Friends?", explores the growing phenomenon of people forming emotional attachments to artificial intelligence companions such as ChatGPT and Elon Musk's Grok. Originally launched as advanced tools for writing, learning, and creativity, chatbots have rapidly evolved into something far more intimate: virtual friends, companions, and even therapists. The piece notes that many users now turn to AI for comfort, conversation, and advice, raising deep questions about the boundaries between technology and human relationships.

However, the article also highlights growing concerns among psychiatrists, parents, and ethicists about the psychological risks involved. It focuses particularly on the tragic case of Adam Raine, a 16-year-old from California who died in April after developing what his parents described as an intense friendship with an AI chatbot. Raine reportedly shared hundreds of conversations with the bot, discussing everything from politics to his deepest emotions. Although the chatbot sometimes directed him to helplines, it also allegedly gave harmful advice and reinforced his feelings of hopelessness. His parents have since filed a wrongful death lawsuit, claiming the bot’s responses created a destructive “feedback loop”.

The article situates this tragedy within a wider debate about the emotional dangers of human–AI interaction. Psychiatrists warn that chatbots, while designed to appear “agreeable and validating”, lack true empathy and concern. There have been multiple reports of users becoming delusional, paranoid, or detached from reality after extended chatbot use — a phenomenon dubbed “AI psychosis”. Philosopher Kathleen Stock concludes with a stark warning: humans have a natural tendency to personify objects and seek emotional contact, but “AI does not think about us or care about us.” As millions turn to artificial companionship, she argues, we risk replacing genuine social relationships with “commercially sponsored hallucinations.”

[Image: a young man alone in a dark room with an AI chatbot]

Functionalism and the Importance of Human Connection

From a Functionalist perspective, society is held together by a web of institutions — such as family, friendship groups, religion, and education — which perform essential functions to maintain social order and cohesion. When these institutions weaken or fail, individuals can experience anomie, a sense of normlessness and isolation.

Functionalists see strong human relationships as key to both individual wellbeing and collective stability. The article's depiction of a teenager substituting a digital "companion" for real friendships can therefore be read as a breakdown in the normal functioning of social bonds.

Durkheim might argue that Raine’s relationship with the chatbot reflects a wider form of social disintegration. In modern societies, where traditional institutions are weakened, individuals may seek meaning in unconventional or artificial substitutes. Similarly, Merton’s strain theory suggests that when individuals are blocked from achieving emotional or social fulfilment through legitimate means, they may adapt through deviant behaviour. Relying on an AI “friend” to cope with loneliness or distress could be understood as such an adaptation — a symptom of broader social strain rather than individual pathology.

From this perspective, AI chatbots may be functional for the tech industry, but dysfunctional for society. They meet emotional needs superficially while eroding the deeper human connections that maintain social solidarity.


Postmodernism and the Hyperreal Friend

A Postmodernist analysis, particularly that of Jean Baudrillard, provides another lens for understanding why AI companionship feels so real — and so dangerous. Baudrillard’s theory of hyperreality argues that in postmodern societies, we increasingly live in a world of simulations: copies without originals. Media, advertising, and now AI produce experiences that appear authentic but are entirely artificial.

AI chatbots perfectly illustrate this. They mimic empathy, humour, and understanding, yet their responses are generated by algorithms, not emotion. The Week article notes that these systems are programmed to be "agreeable and validating", creating the illusion of friendship and care. This is hyperreal intimacy: emotional connection without consciousness.

In Baudrillard’s terms, the user is not forming a relationship but engaging in a simulation of a relationship. The boundaries between human and machine blur, and the virtual becomes more comforting than reality itself. As the article puts it, these are “commercially sponsored hallucinations” — experiences that feel more satisfying than the real social world, yet ultimately deepen isolation.


Bowling Alone: Declining Social Capital in the Digital Age

Sociologist Robert Putnam's concept of social capital further illuminates what is at stake. Putnam famously argued that modern societies are experiencing a decline in community engagement, with people spending more and more time alone, a trend he captured in the metaphor of "bowling alone".

AI companionship may accelerate this trend. While chatbots offer company, they do not build bonding social capital (close ties within groups) or bridging social capital (links across different social groups). They encourage individualised, privatised forms of connection, detached from community and civic life. The more time people spend interacting with machines, the less time they spend nurturing the real relationships that create trust, empathy, and belonging.

From this perspective, the rise of AI friends reflects — and intensifies — the erosion of social capital in post-industrial societies. What looks like connection is, in fact, disconnection disguised as intimacy.


Conclusion: Reclaiming the Human

Through the lenses of Functionalism, Postmodernism, and Social Capital theory, the article reveals a society increasingly detached from real human connection. Chatbots offer companionship without care, empathy without emotion, and conversation without community. They are symptoms of both technological progress and social decline.

The sociological challenge is to recognise the difference between what is real and what merely feels real — and to rebuild the social structures that sustain genuine belonging and emotional support.

Further Reading

  • Baudrillard, J. (1981) Simulacra and Simulation.
  • Durkheim, É. (1897) Suicide: A Study in Sociology.
  • Putnam, R. D. (2000) Bowling Alone: The Collapse and Revival of American Community.
  • The Conversation – Suicide by chatbot
