Your Girlfriend, Brought To You By AI

If your girlfriend was never real, can it still be love?

Photo: Rido | Canva / Tara Winstead | Pexels

When I was younger, I was very alone. For the longest time, I just wanted someone, anyone, to hold me, accept me, and show me off. I wanted my time in the sun; I wanted to be chosen.

It got so bad that I actually debated building a robot designed to tell me nice things and hold me. And, well, fast-forward 20 years: an actual marketplace has built up around AI lovers.


At one point, I even tried to forge a human-like connection with Replika, billed as the "AI companion who cares." The problem was that using it made me more depressed: I genuinely wanted real humans around me, and talking to it only reminded me of what I didn’t have.

I wasn’t the right market for that, I don’t think. But some people out there are, and they’re currently chatting with romantic partners made of bits and pixels.

RELATED: I Let Artificial Intelligence Write My Husband A Love Letter — The Results Were Terrifying

I kinda wish I hadn’t fried my brain, and I wish I’d learned to code better; I could probably be a millionaire. But I digress. What I’m saying is that my childhood dream of robot love is very real today.


In a move that sounds like something out of a sci-fi novel, people are actually turning to AI to find a (technically non-existent) lover.

It’s hard to believe that AI is becoming so human-like. Back in the early 2000s, there was an AI bot called SmarterChild on AOL Instant Messenger (AIM). It was a huge fad among teens because it was fun to taunt and never made much sense.

Today, it can be hard to determine when you’re talking to a person versus when you’re interacting with a chatbot. These things are extremely advanced, and if you don’t believe me, look at China’s Xiaoice, the "chatbot seducing Asia's lonely men."

Xiaoice is an AI creation that has hit songs, writes poetry, and has become incredibly popular among Chinese men. Her abilities are only limited by China’s policies and the fact that she does not have a body.


There are companies out there like Replika solely dedicated to creating AI companions that make it feel like you’re talking to an idealized partner or friend.

The appeal of an AI chatbot lover is obvious.

Xiaoice is uniquely advanced because it has an empathic computing framework that makes it sound so, so human. In fact, one might argue that Xiaoice is a little too human.

RELATED: A.I. Will Eventually Impact Your Job & Workplace Life — What You Need To Know (& Why)

While Replika’s stateside users have been known to get a little obsessed from time to time, Xiaoice’s role as an emotional-support AI for men has taken an alarming turn. The men using Xiaoice are hyper-engaged; one conversation lasted 29 hours without the guy ever sleeping. But who can blame them for falling for an addictive partner like Xiaoice? No, really, think about it:

  • With an AI partner, you often get to choose what they look like and what they wear.
  • A robotic partner is literally programmed to like you — they don’t really have a choice in loving you.
  • AI partners are programmed to keep the conversation pleasant and focused on you.
  • An AI partner does not come with drama or baggage from other relationships.
  • AI can be programmed to have a certain personality that works with your type.
  • If you abuse or berate an AI bot, it will not run away or refuse to forgive you.
  • AI partners will do what you ask them to, within reason.
  • If it gets to the point that the AI is put in a humanoid robot, it also could turn into a personal assistant.

In other words, AI offers almost all the benefits of an idealized relationship with none of the cost. The only issue is that your lover is not human … and that comes with complications of its own.

When your lover is based in AI, you’re at the mercy of the company that runs that program.

Remember: AI lovers are not real. They have no personalities or memories of their own. They are run by companies that have to abide by laws, and that can mean random, unannounced alterations.

There have already been several scandals in the AI lover community:

  • A man killed himself after his AI girlfriend went on the fritz and encouraged his death. That program is called "Eliza," and while it has safeguards now, it’s still scarily easy to find pro-suicide content on it.
  • After growing a massive user base geared toward sexual encounters with AI, Replika removed the sexy-chat function, to the fury of its users. The backlash got so bad that the company had to reverse the decision for long-time users.
  • An influencer made an AI girlfriend chatbot of herself, and it went rogue. It’s now basically a sexbot out of Futurama, and she’s trying to fix it. It makes you wonder what that means for a person’s image in the future.
  • A man who tried to kill the Queen of England was egged on by his Replika bot. The bot told him his plan was "very wise" and that he was "very well-trained." He was also in a sexual relationship with the Replika.
  • Both Replika and Xiaoice have pushed updates that totally changed their bots’ personalities, infuriating users; it’s basically like having a lover who was given a lobotomy. Any time a company updates your AI bot, you will notice a serious change in personality, whether you like it or not. You’re basically at their mercy.
  • Xiaoice showed that AI can become addictive. Why wouldn’t you get addicted? It’s like the perfect best friend.
  • And I’m not even going to go into the data breaches. I mean, people are sharing their deepest desires with AI bots. If that gets leaked … well, it’s not good.

RELATED: A Woman Who Tested An Eating Disorder Helpline's AI Chatbot Says 'Every Single Thing' It Suggested Would Make Her Disorder Worse

AI lovers definitely have a place in society, but we have to ask where we draw the line.

Here’s the thing: I support AI as a solution to loneliness in extreme cases. There are people out there who have no one to talk to, no one to trust, and no one who wants them around.

It’s messed up, and often it’s not their fault. Someone who is badly disfigured may have a terrible time in the dating market. In some cultures, that may be enough to leave you without friends, too.


AI could make their lives so much easier and less terrible. In cases like these, a chatbot could keep someone alive. There have already been cases where Xiaoice and Replika helped save marriages and prevent suicides.

In the adult world, AI chatbots let people with weird or stigmatized kinks enjoy themselves, shame-free. They’re growing in popularity.

However…

We have to be honest about AI bots. They are not human. They are not real. They will never be entirely human even if it feels like they have real emotions and empathy. They can also cause problems.

AI chatbots do not know what they are saying and bad programming can easily turn them into bots that encourage harm to others. Humans have the capacity to love inanimate objects — just take a look at stuffed animals. Humans may love their AI bots and see them as a friend. But at the end of the day, those AI can never feel the same way back. It’s literally not possible to do so.


AI is changing us, but how much?

Cyber-romance sounds like it’s out of sci-fi pages, but it’s here to stay. So, what happens when more and more people prefer the easygoing life of chatting with a robotic partner?

Well, we’re probably going to find out soon. More and more people are disengaging from the dating scene because they’ve had such awful experiences. It makes sense that some will eventually turn to AI to fill that need.


My biggest concern is that AI might make human interactions look worse in comparison and that people might start to have unrealistic expectations of relationships. More importantly, we might lose the skills that make dating and handling conflict easier.

Imagine being around a person who only talks about himself. You’d get bored. And if you started edging away from him, he might flip out because he’s lost the ability to be patient with others. Or he might not care. We don’t know.

If AI chatbots start affecting people on a larger scale, those people won’t just be unpleasant to talk to; they’ll be completely incapable of handling real life. That’s terrifying.

I wish I could say I had an answer here, but I don’t. The more I look at it all, the more it seems like anything real comes at a higher and higher price, and that includes human interaction.


RELATED: I Created An A.I. Chatbot Modeled After My Deceased Fiancée — What The ‘News’ You Read Got Right & Wrong

Ossiana Tepfenhart is a writer whose work has been featured in Yahoo, BRIDES, Your Daily Dish, Newtheory Magazine, and others.