A sci-fi ménage à trois - pourquoi (pas)?

A “cyberpunk aesthetic” view, where technology, body, and our deepest wishes melt together in a sci-fi ménage à trois? Or should we critique a mechanized sexuality and relationship orientation in a digital age? Are we “robomantics”? Or are robots simply robbing us of romantics?

Solitude is not loneliness, and loneliness is not solitude. Solitude is intimacy - the kind of intimacy you have with yourself. It's the antidote to loneliness, because being secure in your own company, you move with ease and meet people with a comfortable breeze. Loneliness, on the other hand, fills you up, takes up space inside of you and in the aura around you. This odour, this particular smell, your incense of desperation, is the opposite of the breeze. People are afraid your smell is contagious. It becomes harder and harder to reach out, to erase the smell of loneliness. So how do you empty your loneliness? If loneliness begets loneliness, could robots be the transference objects that give hope of a corrective emotional experience, making us a bit braver to reach out and (re-)connect with each other?

A corrective emotional experience in psychotherapy occurs when a client re-experiences a past emotional wound in a new, healing way within the therapeutic relationship. Unlike the original painful experience, which may have involved rejection, neglect, or harm, the therapist provides a supportive, validating, and responsive environment. This allows the client to process emotions differently, challenge old patterns, and develop healthier ways of relating to themselves and others. A transference object in psychotherapy is a person, often the therapist, onto whom the client unconsciously projects feelings, expectations, or relational patterns from past significant relationships. This can help reveal unresolved emotions and conflicts, allowing for deeper insight and healing within the therapeutic process.

Could a robot be our transference object? Or would it rather rob us of the willingness, not just the ability, to connect with other people, making us prefer a “robomantic relationship”?

What is love?

Baby, don't hurt me. Don't hurt me no more

What is love?

Baby, don't hurt me. Don't hurt me no more

What is love?

Baby, can't hurt me, no more

How do we protect ourselves against intimacy through technology? In our effort to protect ourselves from rejection and emotional pain, we may unconsciously adopt robotic tendencies in relationships: we minimize vulnerability, suppress deep emotions, and engage in surface-level interactions rather than risking true intimacy.

This can manifest as emotional detachment, rigid control over our expressions, or a preference for explicitly transactional exchanges, where expectations and outcomes feel predictable and safe - as in “I’m an emotional prostitute, so I might as well get paid for it”.

But by distancing ourselves from our own emotional complexity, we might also limit our capacity for deep connection. When we treat relationships like carefully managed systems, reducing uncertainty and spontaneity, we may avoid heartbreak, but we also forgo the richness of genuine human connection.

In a world that increasingly blurs the line between human and artificial interaction, this self-protective strategy can make relationships feel safer, yet ultimately more hollow. Ironically, the more we try to shield ourselves from pain, the more isolated we may become, as true intimacy requires the very things we fear: uncertainty, vulnerability, and the possibility of being hurt. Becoming more robot-like in relationships as a shield against the pain of rejection doesn’t seem to be the answer.

Remember the Eliza effect? Eliza was an early natural language processing chatbot developed in the 1960s by Joseph Weizenbaum at MIT. Designed to mimic a Rogerian therapist, Eliza used simple pattern-matching techniques to respond to users’ statements, often reflecting their words back to them. Rogerian therapy, or person-centered therapy, was developed by Carl Rogers in the 1940s and focused on empathy, active listening, and unconditional positive regard. Unlike the more traditional analytical therapies of the time, which analyzed and diagnosed patients, Rogerian therapy emphasized allowing clients to explore their own feelings in a supportive environment.

Because Eliza used open-ended, non-judgmental responses, many users felt heard and emotionally validated, sometimes believing the program to be genuinely understanding. This reaction surprised Weizenbaum and sparked discussions about the psychological effects of AI, the illusion of an understanding machine, and the ethical implications of human-computer interaction.
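How little machinery it takes to produce that feeling of being heard is easy to underestimate. The reflection trick described above can be sketched in a few lines of Python - a toy illustration of Eliza-style pattern matching and pronoun swapping, not Weizenbaum's original script (the rules and phrasings here are invented for the example):

```python
import re

# Pronoun swaps applied to the user's words before reflecting them back.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

# (pattern, response template) pairs in the spirit of Eliza's Rogerian script.
# The last catch-all rule guarantees there is always something to say.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"i want (.*)", "What would it mean to you if you got {0}?"),
    (r"(.*)", "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones: 'my job' -> 'your job'."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement: str) -> str:
    """Answer with the first rule whose pattern matches, reflecting the captured text."""
    text = statement.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(group) for group in match.groups()))
    return "Please tell me more."

print(respond("I feel lonely in my apartment"))
# Why do you feel lonely in your apartment?
```

The program understands nothing; it only rearranges the speaker's own words into a question. That this is enough to make people feel validated is precisely what surprised Weizenbaum.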

So, does it matter? The mind and body adapt. If a robot or chatbot provides us with an experience that feels real enough, our nervous system can respond accordingly. One benefit of talking to AI, like Eliza and more advanced chatbots today, is that it provides a safe, judgment-free space for people to express their thoughts and emotions. AI can be useful for practicing conversations, exploring emotions, and gaining clarity on personal issues before discussing them with others. It can also serve as a tool for self-reflection, helping individuals build confidence in communication and emotional processing. In that sense, it can function as an affective transference object.

If we feel seen, understood, and safe, the biological response can resemble that of a human relationship, even if we are fully aware that it is a simulation. So, if the most important thing is to feel loved, seen, or understood, and a chatbot can provide that, why would it be any less real than a relationship with a human? Well, if we believe that being in a relationship is about encountering another person with their own subjectivity, complexity, and boundaries, then a chatbot or robot can never fully replace a human connection. But does that matter? Maybe the answer depends on what we seek in relationships.

Isn't it all a projection anyway? Projection in love occurs when we attribute our own feelings, desires, and unresolved issues to another person. This is the alchemy of falling in love; a person has to trigger our deepest wounds, but also give hope of resolution through fulfilment of our deepest longing for belonging. These unconscious perceptions are shaped by our own inner world rather than by reality, because we haven't taken the time to actually get to know the other (I just want you to be what I want).

The idealized someone can immediately make us feel complete. But if we don't incorporate and inhabit, in ourselves, those qualities that made us feel whole through the meeting with this “catalytic someone”, those qualities are only rented. The rental expires the moment you lose that significant catalyst - this rental of the other being (in) you.

I would argue that projecting onto a human being is different from projecting onto a robot or chatbot - or rather, that it has the potential to take us from falling in love to actual love. With other humans, we have the possibility to interact with their complexity, their own inner world and subjectivity, their emotions, agency, and awareness. A human partner can respond to, challenge, or confront our projections, consciously or unconsciously, which can lead to growth, conflict, or deeper connection - a prerequisite if you want to go from falling in love to actually loving someone beyond the projections. In contrast, a robot or chatbot, programmed with limited responses, cannot provide the same emotional depth or challenge our projections in potentially painful but necessary ways. While a robot might validate or mirror our feelings, it lacks the capacity to truly engage with or alter our projections, which can leave the interaction feeling shallow or incomplete - or rather, lacking in love.

Is It a Substitute or a Replacement? What about those of us who are unable to form relationships with other people? Not because there is anything inherently wrong with us, but because society is structured in a way that makes disabilities, both visible and invisible, an exclusion criterion for participation. In that case, could AI be less of a substitute and more of a necessity?

Rejecting AI as a potentially relationally meaningful experience might be a privilege for those of us who feel we have a choice between AI and other humans. I would argue that society should become more inclusive rather than more "AI-sive", but what are lonely people supposed to do in the meantime?

An individualistic society prioritizes personal independence, self-reliance, and individual achievements over collective or community-oriented values. In such societies, people are encouraged to pursue their own goals, make their own choices, and take responsibility for their successes and failures. I vote for autonomy and for relationships that are voluntary and based on personal fulfillment, rather than on duty or obligation forced upon us by patriarchal structures. But it has its downsides too: while individualism can foster innovation and personal freedom, it can also lead to social isolation and a weaker sense of communal support, particularly for those who struggle to fit into mainstream social structures.

So the question is not so much whether AI is good or bad for you, or whether we want a more inclusive or a more "AI-sive" society, but rather how we can make use of the technology in the most beneficial way while we keep building a more inclusive society and challenging the poisonous parts of an individualism-driven society.
