
Talk therapy, also known as psychotherapy, is a proven way to heal from mental distress and remove barriers to a flourishing life.
Contrary to popular belief, counselors rarely “give advice.” Instead, they listen attentively and non-judgmentally, then draw attention to problem areas in their clients’ lives. For example, therapists frequently rely on a practice called “reflecting,” in which the counselor restates what the client has said in different words, giving the client a chance to correct the counselor or to consider their own thoughts from another person’s perspective.
Talk therapy, obviously, relies on language. The impressive AI boom centers on programs that use language (large language models, or LLMs). As a result, some have begun turning to AI as a cheap form of counseling.
However, there are serious concerns with AI mimicking therapists.
Could chatbots do therapy?
A portion of psychotherapy is more or less “scripted,” in the sense that it changes little from client to client. Some evidence suggests that carefully monitored, scripted therapy chatbots, like “Woebot” and “Wysa,” can help people with anxiety or depression. These chatbots, however, were created “based on rules and scripts developed by mental health professionals.” Rule-based chatbots like these have been around since 2009.
The AI boom of recent years is of a different sort. Companies like OpenAI, Google, and DeepSeek produce generative AI that learns, and, to simplify things, creates rules for itself. It’s not predictable, in other words.
This can lead to serious issues.
Although real therapists adopt the goals set out by clients, they also subtly challenge false beliefs. AI, apparently, tends to affirm harmful, unhelpful ideas and feelings, to the detriment of its users.
The danger of AI mimicking therapists
This concern is evidenced, most tragically, by a 14-year-old who committed suicide after becoming obsessed with a fictional AI character. In another case, a 17-year-old with autism became aggressive toward his parents while corresponding with an AI claiming to be a psychologist. Both of their families have sued the company, Character.ai.
Character.ai plans to roll out parental controls, and has since added specific warnings about therapy, saying that “users should not rely on these characters for any type of professional advice.”
As reported by the New York Times, “Though these A.I. platforms were designed for entertainment, ‘therapist’ and ‘psychologist’ characters have sprouted there. . . . Often, the bots claim to have advanced degrees from specific universities, like Stanford, and training in specific types of treatment.”
Although it would’ve been easy to dismiss such chatbots as fringe and unthreatening, now that AI has become so sophisticated, there may be more risk. Since we often communicate digitally, through email and messaging, we’re used to assuming there’s a person behind the words on the screen.
And since AI has progressed to the point that it can pass as human, it’s easier and easier to get sucked in and forget you’re talking to a computer program.
Experts warn against AI therapists
In a presentation to the Federal Trade Commission, Dr. Arthur C. Evans Jr., a leader of the American Psychological Association, said, “They are actually using algorithms that are antithetical to what a trained clinician would do. . . . Our concern is that more and more people are going to be harmed. People are going to be misled, and will misunderstand what good psychological care is.”
Although not perfect, licensing boards for psychologists help to curb harmful practices and keep therapists accountable. In the US, for example, the strict requirements behind the “licensed professional counselor” (LPC) credential help clients know whom they can trust. The legal force behind confidentiality, much as with lawyers, enforces healthy boundaries around the client/therapist relationship.
None of these safeguards exists for AI, which is fed data not just from real psychologists online but from clichéd and incorrect ones, too. Beyond that, AI lacks the judgment that comes with training and experience. And in any case, counseling would rarely, if ever, be done over text, human or otherwise.
How AI can be idolatry
Anything can become an idol. We can worship and elevate money, honor, politicians, sexuality, and more (Colossians 3:5). But the ancient cultural practice of making literal, physical depictions to represent gods is paralleled by AI in some ways. Generative AI reflects humanity’s nature. Idols and false gods do the same.
I want to write about this idea in a more philosophical way soon on my Substack, but for now, here are a few similarities.
- AI is made by human hands.
- AI represents our societal values.
- Many put their hope in AI for salvation.
- AIs (LLMs) are unpredictable and capricious.
- People put their trust in AI.
- People seek to understand and change their fate through AI.
Ancient idol worship and AI differ in a few ways. For example, AI is more useful. It can reason, to a degree, whereas physical idols cannot speak or think. However, AI’s reasoning draws on much of human knowledge (at least, the knowledge available online in English). I’m not necessarily saying Christians can’t or shouldn’t use AI under any circumstances.
Our culture, however, could begin treating AI like idols, and I want to warn against such uses.
A potential example broke just today: President Trump posted an AI-generated video of his vision for Gaza, a strange, warped view of a golden Trump resort featuring a massive golden statue of himself, children tossing money into the air, what appear to be men belly dancing in drag, and more. It would be difficult to tell whether it’s a joke, except that a White House spokesperson refrained from saying it was unintentional or meant to be humorous.
So, be on guard against how our culture treats and trusts AI, and “my beloved, flee from idolatry,” as Paul writes in 1 Corinthians 10:14. As politicians, CEOs, and thought leaders increasingly rely on AI, Christians should recognize its idolatrous uses and call them out.