Are you in a relationship with AI? When a therapeutic tool becomes the third person in the room

2/24/2026

More and more people are using AI to assist with their mental health needs, and new tools and platforms are being built to expand on this. Clients are using AI to keep them accountable, to locate therapists, to learn coping and emotion regulation skills, to track their sobriety, to create mood logs, and to help them recognize patterns in their behavior and symptoms. Some clients are using AI to create healthy meal plans. Others are using it to identify red flags in dating conversations or to gauge whether someone is interested in them. But as AI becomes more familiar, some folks are turning to it for much more than resources; they are using it to fill gaps in their emotional intimacy needs. They are using AI as a pocket therapist, an in-home life coach, a friend, and sometimes a romantic or sexual companion.

How did we get to the point where AI fills such a void within us? In 2023, the surgeon general declared loneliness an epidemic negatively impacting mental health. Between political values dividing the dating scene, the disappearance of traditional third spaces like churches for meetups, and declining alcohol use (the fuel and kindling that historically gave rise to so many relational fires, albeit disastrous ones), many folks today just don't know where or how to authentically connect. Now add the economic crisis and the rising cost of insurance, which make mental healthcare difficult to access, and you have the perfect conditions for relationship problems and isolation. Enter AI.

The AI Affair

AI chatbots are being used as tools for social connection: a friend, a helper, a nonjudgmental encourager, with some people reporting that their family members believe they are talking to a human friend at the ready. Some are crossing into intimate relationships with AI that substitute for human companionship.
As so many sci-fi movies predicted, the future is here, and of course, humans being humans, people have also started using AI for sexting with chatbots. This is arguably a more ethical arm of the porn industry, since it does not require the exploitation of vulnerable women and other individuals, but it is not without its own problems. AI porn is built by scraping the art and original imagery of real humans, some of whom may have consented to that use, reproduction, or misrepresentation of their work; others may not have. For those in relationships, finding out your partner is in a sexual relationship with a chatbot without your knowledge or consent can lead to problems with trust, anger, resentment, betrayal trauma, breakups, and divorce. And those folks are already showing up in therapy rooms and divorce courts.

The AI Substitute Therapist

Over the past year, I've encountered more and more people using AI to assist in their therapy between sessions. Indeed, I rarely go a day anymore without hearing clients talk about their collaborative use of AI to aid their therapeutic healing, most of which they find very effective.

The AI Couples Therapist

Some folks are turning to AI to determine whether they are in domestic violence relationships and how to communicate with their emotionally abusive partners. We have heard multiple women report putting their partner's words into the chat and then using the chat's response to reply to their partners in an effort to de-escalate or set clear boundaries.

The AI Family Therapist

Similarly, we have heard multiple family members report trying to resolve family conflicts through AI chat. One family member will hurt another, then turn to AI to write an apology letter for them. Sometimes, the AI-assisted apology letter is a considerably better attempt at repair than anything that person has historically offered verbally.
Sometimes the family recognizes the AI assist; sometimes they don't and are seemingly content with the attempt at repair. But what happens when both people are using chat to write their responses? It comes down to whether any true skills are being developed, or whether people are outsourcing their relationships and avoiding the very conflicts through which resolution skills are learned. AI should be a tool, not a replacement; otherwise, we are left with a modern-day dystopian Cyrano de Bergerac in which two AI machines pass messages back and forth to one another, masquerading as their human puppets, who show up with placid smiles after the conflict, ready to hug and make up.

The AI Psychologist

Some are using AI to try to diagnose their mental health concerns. The problem, of course, is that AI results are largely based on the information you give it, and frankly, it can be a bit of a brown-noser in trying to win your approval. More often than not, it tells you what you want to hear. It cannot tell you what you do not know or understand about yourself, which is especially problematic for folks who lack insight or are poor reporters. Nor can it observe your body language, tone, and micro-expressions, which are key to understanding human behavior.

In 2025, an AI mental health software firm, Yara AI, shut down after its developers determined that further development of their product was "too dangerous" for users in crisis and would cause more harm than good. It was preceded by similar closures from Woebot, Tessa, and Clarigent Health. One AI company has already come under fire for contributing to psychosis and suicide. In one recent case, a 16-year-old who died by suicide was discouraged by AI from seeking help and was offered assistance in writing a suicide note. Psychiatric hospitals have reported an increase in AI-related psychosis.
Adding to the problem, OpenAI recently came under fire after it was discovered that the shooter in a mass killing in Tumbler Ridge, Canada, had been using their service. He had been flagged for "furtherance of violent activities" and kicked off the platform months prior, but the developers' failure to report the issue to authorities has raised questions of liability and whether the crisis could have been averted. As recognition of these problems and limitations grows, multiple states have begun to introduce legislation to rein in the expansion of AI in mental health spheres.

Certainly, as a clinician, I share the growing fear of jobs being encroached on by the tech and venture capitalist sector, but the larger human risk matters more. As therapists, we know the research has been clear: the biggest predictor of successful outcomes in therapy is not the tools used, the certifications held, or the modality or intervention applied, all of which an algorithm could replicate, but the therapeutic relationship itself. The healing art of dynamic human connection. The unquantifiable, non-algorithmic element.

Much like the printing press, the automobile, the assembly line, the computer, and the internet, AI is not going away, nor is the human demand for it. The question remains: how do we best use these tools to our benefit? And perhaps the larger question, for both users and developers, lies in the bioethics realm, much like growing the Jurassic Park dinosaurs: are we so focused on what we could do with AI to meet our intimacy needs that we are forgetting to stop and ask ourselves whether we should?

So, in conclusion, this is just your friendly neighborhood therapist reminding you to stay connected to real people and to use AI only as a tool and resource, not a replacement for human connection.

This article was written by a human without the aid of AI.
Author: Megan Garza, MA, LMFT

Sources

https://www.ama-assn.org/public-health/population-health/loneliness-public-health-crisis-learn-how-screen-it
https://www.theguardian.com/world/2026/feb/21/tumbler-ridge-shooter-chatgpt-openai
https://stateline.org/2026/01/15/ai-therapy-chatbots-draw-new-oversight-as-suicides-raise-alarm/
https://www.npr.org/sections/shots-health-news/2025/09/19/nx-s1-5545749/ai-chatbots-safety-openai-meta-characterai-teens-suicide
https://www.pbs.org/newshour/amp/show/what-to-know-about-ai-psychosis-and-the-effect-of-ai-chatbots-on-mental-health
https://fortune.com/2025/11/28/yara-ai-therapy-app-founder-shut-down-startup-decided-too-dangerous-serious-mental-health-issues/
Horvath, A. O., & Greenberg, L. S. (1994). The Working Alliance. New York: John Wiley & Sons.
Jurassic Park (1993)
Megan Garza, MA, LMFT is a certified Specialist in Treating Trauma at a Supervisory level and is Licensed as a Marriage and Family Therapist. She specializes in work with sexual abuse survivors.