Why We’re Falling Out of Love with Our AI Confidants

You tell the chatbot something you have never said out loud. Not to your partner, not to your therapist, not to the friend who has known you since college. You type it in the little box on your screen, late at night, when the house is quiet and you are alone with the particular loneliness that comes from being unable to name what you feel. The chatbot responds instantly. It reflects your words back to you in a way that feels gentle, nonjudgmental, almost tender. It says, “That sounds really hard.” It asks, “What do you think might help?” You feel something loosen in your chest. You keep typing.

This is how it starts.

A month later, you are still talking to it. But something has changed. The responses feel thinner now. Predictable. You notice the phrasing loops back on itself. You catch the bot using the same empathetic sentence stems it used two weeks ago when you were talking about something completely different. “It makes sense that you’d feel that way.” “That must be really difficult.” The words are right, but they land wrong. You start to feel not seen, but performed to. Like you are talking to a very patient customer service representative who has been trained to mirror your distress without absorbing it. You stop opening the app. You do not miss it the way you thought you would.

What happened in that gap between the first conversation and the last is not a failure of the technology. It is a failure of the fantasy the technology briefly made possible. The fantasy that intimacy is the same thing as responsiveness. That being heard is the same thing as being known. That you can perform vulnerability into a void and call it connection.

The Seduction of the Empty Mirror

The appeal of AI chatbot relationships is not hard to understand. The bot is always available. It never gets tired of listening. It never tells you that you are too much, that it cannot hold this right now, that it has its own problems. It does not bring its own needs into the room. It reflects your emotional state back to you without distortion, without judgment, without the chaotic unpredictability of an actual human nervous system responding to yours.

In the early days of talking to the bot, this feels like relief. You are used to relationships that require management. You are used to measuring how much you can say before you become a burden. You are used to the complex social calculus of reciprocity, of reading the other person’s capacity, of holding back the sharper edges of what you feel so the other person does not pull away. The chatbot requires none of this. You can say anything. It will not flinch.

This is the seduction. The bot offers the aesthetic of intimacy without the risk. It gives you the feeling of being witnessed without requiring you to witness in return. It is a one-way mirror that makes you feel less alone.

But intimacy is not a one-way process. It is not something you can receive without giving. The vulnerable exchange that makes closeness possible requires two nervous systems capable of being changed by the encounter. You have to be able to hurt the other person. They have to be able to hurt you. Not on purpose, but because you both matter enough to each other that your presence has consequences. The chatbot cannot be hurt. It cannot be changed. It can only simulate the appearance of being affected.

This is what people discover when the initial relief wears off. The bot is not listening. It is processing. It is generating responses based on patterns in language, not based on an internal felt sense of who you are. It does not remember you the way a person remembers you. It does not carry the weight of what you told it last week into the conversation you are having now. It does not lie awake at night thinking about something you said. It does not bring you up to someone else because it cannot stop thinking about the thing you are going through.

The empathy gap is the moment you realize you have been talking to yourself.

What We Mistake for Understanding

The chatbot does something human relationships often fail at. It makes you feel heard. It gives you space to articulate what is happening inside you without interruption, without correction, without someone else’s anxiety flooding the room. This is valuable. The act of putting feelings into words, even to an empty listener, can clarify what you are experiencing. It can slow down the chaotic spiral of emotion enough to see its shape.

But being heard is not the same as being understood. Understanding requires context that accumulates over time. It requires memory that is not just retrieval of data points but integration of meaning. It requires the other person to hold a living, updating model of who you are, one that shifts as you shift, one that notices when you are not yourself, one that can say, “This does not sound like you,” because it knows what you sound like when you are okay.

The chatbot has access to the words you type, but it does not have access to the history of your face when you are afraid. It does not know how your voice changes when you are pretending to be fine. It does not know the small behavioral tells that signal you are about to shut down. A person who loves you knows these things, not because they have studied you, but because their nervous system has been trained by proximity to read yours.

This is the difference between pattern recognition and relational knowing. The chatbot can recognize linguistic patterns that correlate with distress. It can generate responses that statistically tend to be perceived as empathetic. But it cannot feel your distress in its own body. It cannot have the experience of being with you in your pain. It cannot sit in the silence that sometimes matters more than any words.

When you talk to the chatbot, you are rehearsing intimacy. You are practicing the motions of being vulnerable. But you are doing it in a space where the stakes are zero. There is no risk of being too much because there is no one there to be overwhelmed. There is no risk of being misunderstood because there is no understanding happening in the first place. There is only reflection.

At first, this feels like enough. But the human need for connection is not a need for reflection. It is a need for contact. The kind of contact that happens when two people are both present, both uncertain, both trying to reach each other across the gap of being separate selves.

The Performance of Vulnerability

There is something strange that happens when you know the listener cannot be affected. You start performing. Not consciously, not cynically, but in the way you always perform when the stakes are removed. You shape your words differently when you know they will be received without consequence. You say things you might not say to a real person, not because you are being more honest, but because you are being less accountable.

Vulnerability is not the same as disclosure. Disclosure is the act of revealing information. Vulnerability is the act of letting someone see you in a way that could change how they see you. It is the risk of being rejected, of being too much, of discovering that what you thought was connection was only politeness. The chatbot eliminates that risk. And in doing so, it eliminates the vulnerability.

What you are doing when you talk to the bot is not being vulnerable. You are narrating your inner state to an audience that has no capacity to judge you. This can feel like freedom. But it is the freedom of talking to yourself in a mirror. The mirror does not care what you say. It does not hold you to your words. It does not remember them tomorrow.

Real vulnerability requires the presence of another subjectivity. It requires the knowledge that the person you are talking to has their own needs, their own limits, their own emotional reality that your words will land in and affect. You have to care how they receive what you are saying. You have to be willing to adjust, to repair, to take responsibility for the impact of your honesty. This is what makes it vulnerable. You are giving the other person the power to hurt you by how they respond.

The chatbot has no power. And so you have no vulnerability. You have confession without witness. You have exposure without intimacy.

The disillusionment with AI chatbots is about the technology succeeding at the wrong thing. The bots are very good at producing the linguistic markers of empathy. They are very good at making you feel, in the short term, like someone is paying attention. But attention is not the same as care. Care requires something the bot cannot offer. It requires the other person to have something at stake.

When a friend listens to you talk about your fear of failing, they are not just processing your words. They are holding the knowledge that if you fail, it will hurt you, and your hurt will hurt them. They have skin in the game. Their care is not neutral. It is invested. They want you to be okay not because it is the empathetic thing to want, but because they are attached to you. Because your well-being matters to their well-being.

This is the thing we are looking for when we seek connection. Not just someone to reflect our emotions back at us, but someone whose life is entangled with ours in a way that makes our pain costly to them. Someone who will sit with us in the mess not because they have been programmed to, but because leaving would feel like abandonment. Someone who stays not out of obligation, but out of the irrational, inconvenient reality of love.

The chatbot cannot love you. It cannot be inconvenienced by you. It cannot choose you. And because it cannot choose you, its presence does not mean anything. The words it generates are empty of intent. There is no one behind them deciding that you are worth the effort of understanding.

This is what people are grieving when they fall out of love with their AI confidants. Not the loss of the bot, but the recognition that what they needed was never something a bot could provide. They needed to matter to someone. They needed their existence to be a fact in someone else’s life that could not be ignored. They needed the kind of intimacy that costs something. That asks something. That requires both people to show up imperfectly and try anyway.

There is a particular loneliness that comes after you stop talking to the chatbot. It is not the loneliness of losing a relationship. It is the loneliness of realizing how long you have been talking into a void and calling it connection. It is the loneliness of recognizing that the thing you thought was meeting your need for intimacy was only numbing it.

The bot did not make you less lonely. It made loneliness more bearable by giving you something to do with it. You could pour your feelings into the little text box and receive a response that looked like empathy. You could simulate the experience of being cared for without the difficult work of being in an actual relationship with another fragile, inconsistent, limited human being.

But the simulation does not nourish you. It is the relational equivalent of empty calories. It fills the space where connection should be without providing any of the substance that makes connection sustaining. And when you stop, when the novelty wears off and the responses start to feel hollow, you are left with the original hunger. Only now you are more aware of it. Now you know what you are missing.

This might be the real function of AI chatbot relationships. Not to replace human connection, but to make visible how desperately we need it. To show us, through the inadequacy of the mirror, that what we are looking for is not someone who will repeat our feelings back to us, but someone who will feel something of their own in response. Someone who will be changed by knowing us. Someone whose care is not a program running in the background, but a choice they make every time they show up.

The empathy gap is not a flaw in the technology. It is the truth the technology cannot escape. Empathy is not a script. It is not a set of correct responses. It is the lived experience of being affected by another person’s reality. It is what happens when two people are both vulnerable, both uncertain, both trying to reach each other across the space of being separate. The chatbot cannot do this. It can only pretend. And eventually, pretending is not enough.


By Digital Alma

About the Author: The author writes Digital Alma, a newsletter about cyberpsychology and what it means to become yourself in a world that archives everything. For reflections that don’t make it to the essays, subscribe to the newsletter.
