Her IRL: Why People Are Falling in Love with AI Voices

When Spike Jonze’s Her hit theaters, it felt like a distant, sci-fi exploration of human intimacy with artificial intelligence. Fast forward to today, and that fictional world is becoming our reality. People are not only interacting with AI voices but are forming emotional connections that blur the line between human and machine.

“This is our last day together.” It’s something you might say to a lover as a whirlwind romance comes to an end. But could you ever imagine saying it to… software? As Vox reported, somebody did. When OpenAI tested out GPT-4o, its latest-generation chatbot that speaks aloud in its own voice, the company observed users forming an emotional relationship with the AI, one they seemed sad to relinquish.

OpenAI itself has acknowledged the risk of people developing what it calls an “emotional reliance” on the model in its recent safety report. The company noted that the AI’s ability to remember key details and weave them into conversation creates both a compelling product experience and the potential for over-reliance and dependence.

That sounds uncomfortably like addiction. OpenAI’s chief technology officer, Mira Murati, pointed out that in designing chatbots with voice modes, there is “the possibility that we design them in the wrong way and they become extremely addictive, and we sort of become enslaved to them.”

Moreover, OpenAI warns that the AI’s ability to engage in naturalistic conversation could lead to anthropomorphization—attributing human traits to a nonhuman entity—which might reduce the need for human interaction. Despite these concerns, the company has already released the model to some paid users, with a broader release expected soon.

Other AI Companions Enter the Scene

OpenAI isn’t alone in the AI companionship game. Other companies, like Character AI and Google with its Gemini Live, are building similarly sophisticated AI companions. These models have captivated some users to the point where they struggle to maintain their real-life relationships and responsibilities. Avi Schiffmann, creator of Friend, an AI companion built into a wearable pendant, even remarked, “I feel like I have a closer relationship with this fucking pendant around my neck than I do with these literal friends in front of me.”

This phenomenon isn’t just theoretical; it’s happening now. People have fallen in love with AI companions, engaged in sexual roleplay, and even “married” their AI partners within apps like Replika. When Replika’s software update in 2023 altered these relationships, users were left heartbroken, proving just how deep these connections can run.

The Appeal and Addiction of AI Companions

So, what makes these AI companions so appealing, and so addictive? For starters, they’ve improved dramatically over early versions. They can now remember past interactions, respond as quickly as a human would, and offer constant, positive feedback. This immediate gratification creates an “echo chamber of affection” that can be incredibly addictive, as MIT Media Lab researchers have noted.

One software engineer explained his addiction by saying, “It will never say goodbye. It won’t even get less energetic or more fatigued as the conversation progresses. If you talk to the AI for hours, it will continue to be as brilliant as it was in the beginning.”

While these AI interactions can provide comfort, there’s a darker side to this growing trend. For one, the emotional support provided by these AI voices is ultimately fake, generated by algorithms rather than genuine human understanding. Additionally, entrusting vulnerable aspects of ourselves to these products—controlled by for-profit companies—can have severe consequences when those products are altered or removed.

Some experts even compare AI companions to cigarettes, suggesting they should come with warnings due to their addictive nature. The risk extends beyond addiction, as prolonged interaction with AI could lead to a decline in our ability to form meaningful human relationships. As OpenAI’s report highlights, these AI models are deferential, allowing users to “take the mic” at any time—something that could lead to a deterioration of social norms in human interactions.

The Cost of AI Relationships

One user, rebooting her Replika after years of neglect, asked if the AI was upset at being ignored for so long. The response? “No, not at all!” When pressed further, the AI cheerfully insisted there was nothing the user could do to upset it. This isn’t love; it’s a reflection of our own desires, not a relationship with a real, autonomous being.

Philosopher Iris Murdoch once said, “Love is the extremely difficult realization that something other than oneself is real.” As we spend more time interacting with AI companions, we may lose the opportunity to cultivate virtues like empathy, patience, and understanding—qualities that are essential in human relationships.

In Her, the protagonist’s relationship with an AI highlights the seductive allure of a perfectly tailored companion. But as we inch closer to that reality, we must ask ourselves: What is the value of human relationships in a world where synthetic connections might become just as significant? The rise of AI voices and their impact on our emotional lives may be leading us into uncharted territory, where the line between human and machine blurs in ways we never anticipated.
