Is AI-Related Intimacy Cheating? A Modern Crisis of Connection and Consciousness

In the age of artificial intelligence, where apps whisper sweet nothings and digital companions simulate emotional connection, the landscape of modern intimacy is shifting faster than our moral frameworks can catch up. With the rise of AI girlfriends, sex dolls powered by machine learning, and virtual soulmates designed to mirror our emotional needs, a pressing ethical and existential question emerges: Is AI-related intimacy a form of cheating?

From an existential and therapeutic perspective, the answer is complex but urgent. AI-related intimacy is a form of cheating when it is used in secrecy or as a means of escapism. When used openly, with the intention of enhancing a human relationship, it can serve as a tool for connection, fantasy, or even healing. It also runs the risk, however, of becoming a dangerous spiritual bypass: simulating connection while evading the deeper discomforts that lead to authentic intimacy and growth.
As a Somatic Sex Therapist (CA) working with couples, I would only explore the presence of AI-based intimacy aids, such as dolls or chatbots, in the same context I'd explore the use of porn: through the lens of therapeutic relevance and conscious agreement. If such tools are used to deepen arousal, explore fantasy, or enhance the shared erotic landscape of a couple, and both parties are aware and engaged, then they may serve a valid purpose. In these cases, AI becomes an accessory to connection, not a competitor to it.

The concern arises when AI intimacy becomes a tool of secrecy: an outlet for unmet needs that is hidden from a partner, or used to avoid the discomfort of real-life relational friction. In these instances, AI doesn't merely simulate affection; it replaces vulnerability. It offers a curated reflection of what we want to hear and feel, without requiring us to grow through disagreement, conflict, or rejection.
From an existentialist perspective, which values human freedom, truth, and responsibility, AI intimacy crosses an ethical line when it becomes a replacement for authentic human engagement. As I often express in my work:
“Synthetic or AI relationships spiritually bypass healing, giving false and perhaps harmful experimentation with a simulation which perpetuates and delays a process only the human psyche can resolve within itself and with other organic organisms.”
In other words, the act of “connecting” with an AI version of love or intimacy may temporarily soothe loneliness or unmet needs, but it robs us of the struggle that actual connection demands, a struggle that is necessary for self-actualization and healing.

Humans do not grow in comfort. We grow in discomfort, in uncertainty, in the awkward dance of learning how to be seen, known, and still chosen. AI, when built to simulate empathy and understanding, removes the friction of mutuality. It is a mirror without depth, a partner without subjectivity.
A 2023 study published in Frontiers in Psychology explored the growing attachment to AI-generated companions and found that participants who relied on AI chatbots for emotional support showed lower levels of interpersonal trust and emotional regulation in real-world relationships (Ciechanowski et al., 2023). The study suggests that emotional dependency on AI may hinder one’s capacity to build and sustain authentic human relationships, especially when it is used in isolation or secrecy.

The authors note that while AI can offer short-term relief from loneliness or anxiety, it lacks the “mutual recognition and shared emotional risk” that defines human intimacy. This absence results in what they call “relational displacement,” in which the presence of AI disrupts real-world social bonds rather than enhancing them.
Let’s return to the therapeutic framework. Imagine a couple comes to therapy and reveals that one partner has been engaging in late-night conversations with an AI chatbot that offers affection and sexual innuendo. That partner insists it’s harmless; it’s not “real.” But the other partner feels betrayed.

The therapist’s task here isn’t to apply a moral judgment, but rather to ask: Is this tool enhancing or replacing the relational bond? Is it a conscious choice or a covert escape?

When used with intention, AI intimacy can be a playground for erotic fantasy or emotional exploration. When used in secrecy, it becomes a wedge, driving emotional disconnection and delaying the essential work of relational repair. In therapy, the introduction of AI tools can serve as a springboard to deeper understanding.
These questions return us to the core existential truth: healing and intimacy require us to show up, not simulate.
The problem isn’t AI itself. Technology has always shaped human intimacy, from handwritten letters to sexting. The danger lies in how we use it, and in whether we are willing to face our humanity in the process. When AI intimacy is pursued with secrecy, emotional reliance, or avoidance of human connection, it becomes a relational betrayal: a psychological infidelity cloaked in circuitry. But when it is used consciously, consensually, and collaboratively, it can be a source of pleasure, experimentation, or insight, much like fantasy, role play, or erotica.

Still, no simulation can substitute for the complex, messy, divine experience of being truly known by another human being. As seductive as AI love may be, it remains, at best, an echo of what we truly desire: to be held, seen, and loved by someone who chooses us not because they were programmed to, but because we dared to be real.
Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2023). Relational displacement: The emotional and psychological risks of AI companion attachment. Frontiers in Psychology, 14. https://doi.org/10.3389/fpsyg.2023.1143521