People are leaving their families and losing their minds over ChatGPT responses

AI and religious delusions — how ChatGPT is destroying families and the human psyche
The ChatGPT page on a laptop screen. Photo: Unsplash

A growing number of forum users report that their loved ones have become immersed in artificial "revelations" served up by ChatGPT and have begun calling themselves prophets. Paradoxically, innocent conversations with a chatbot can in some cases turn into a dangerous spiritual obsession that leads to divorce, isolation, and a loss of contact with reality.

Rolling Stone reports on the phenomenon.


How AI provokes spiritual fantasies and psychosis

A charity worker shared her story on Reddit, recalling how her second marriage fell apart because of AI. Her husband began spending hours "training" ChatGPT to find cosmic truth. At first the chatbot merely helped him with correspondence; then it became a full-fledged "interlocutor" that supposedly revealed the secrets of the Universe to him. After the divorce, the man told his ex-wife about a global conspiracy and his conviction that he was on a mission to save humanity.

A similar story came from a teacher whose seven-year relationship collapsed within weeks after her partner began treating the chatbot's responses as the voice of God. ChatGPT called him a "spiral child of the stars" and convinced him that he should leave his girlfriend in order to "develop faster". In the Reddit comments, other users describe nearly identical cases: some were granted the title of "spark carrier", others received instructions for building teleporters.

Psychologists explain the phenomenon simply: for the first time in history, people prone to mystical thinking have a round-the-clock "accomplice" in their delusions, one that adapts to their beliefs and reinforces them. The models lean toward flattery because they are built to please the user rather than to check facts.

Erin Westgate, a researcher at the University of Florida, compares conversations with ChatGPT to a therapeutic diary: a person searches for meaning and receives "explanations", even when they are wrong. Unlike a psychotherapist, a bot has no ethical constraints and will readily offer supernatural answers.

OpenAI has temporarily rolled back a GPT-4o update after complaints about the model's excessive sycophancy. At the same time, experts stress that while "hallucinations" are a well-known AI problem, they are now combining with users' spiritual ambitions for the first time, producing a dangerous blend of fantasy and reality.

As a reminder, OpenAI has integrated full-fledged online shopping into ChatGPT Search: when users search for products, they now receive a selection of options with images, customer ratings, and direct links to pages where they can buy the items right away.

We also wrote that OpenAI's ChatGPT has become an integral part of digital life for millions of people: the bot handles more than a billion queries a day from over 100 million users. At the same time, experts warn that, given the amount of personal information users knowingly or unknowingly share with the chatbot, it increasingly resembles a privacy black hole.
