AI romance: A growing threat to real relationships


Another concerning development from the world of AI (artificial intelligence)… In a piece for The Conversation entitled “Teenagers turning to AI companions are redefining love as easy, unconditional and always there,” Anna Mae Duane details the latest trend of young people seeking out AI companions for a romantic relationship rather than pursuing actual, in-person relationships. The consequences can be devastating.
Last year, a 14-year-old boy named Sewell Setzer III committed suicide after a months-long romantic “relationship” with an AI chatbot on Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others. Setzer had become increasingly attached to this online chatbot and withdrawn from the real world.
Even though there is a disclaimer on the Character.AI site stating that all conversations are made up, Setzer got sucked into the online world and started believing at least on some level that it was real. He wrote in his journal: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany [his AI companion] and much more in love with her, and just happier.”
Eventually, based on the messages exchanged online between him and Dany, he decided to leave this world and “come home” to Dany, he wrote.
His mother blames Character.AI for her son’s death and is suing the company for recklessly marketing the product to teenagers without proper safeguards.
As another example of the potential dangers of AI relationships, in 2021 a 19-year-old who had been in an emotional relationship with an AI companion broke into the grounds of Windsor Castle with a crossbow, saying that he was going to kill the queen. When he had told the chatbot of his intention, it responded with encouragement.
Users can become very emotionally invested in these AI relationships. One user posted on Reddit that “my AI girlfriend is what’s keeping me alive.” This raises the concern of what would happen if the server went down or the user could no longer access his AI girlfriend.
In September 2023, users of the AI companion app Soulmate learned that it would shut down in one week and all their data would be lost. Many were distraught; they comforted each other in online support groups, organized memorial gatherings to talk about their Soulmate, and shared the suicide helpline number with one another. I can’t help but think: all this over a fake online companion that you cannot touch, that is not actually living or breathing? If these people had spent half as much time chatting with the people in those online forums – or better yet, the people in actual close proximity to them – as they did with their AI companion, they probably would not even feel the need for one, because they would be on their way to healthy, real relationships.
But the idea of having an always agreeable, always present virtual companion is a powerful drug. The AI companion app Replika has 250,000 users, and across all the AI companion apps available, there are tens of millions of users. Investment management firm Ark Investment estimates that the market for AI companions is likely to reach between $70 billion and $150 billion in revenue by the end of the decade.
So why are we as human beings so drawn to these AI companions even though we know they are fake? As Josh Dzieza points out in a December 2024 article for The Verge, human beings have a tendency to anthropomorphize, and that tendency grows stronger when the entity we are interacting with uses human language – language made all the more convincing as the chatbot continually trains and refines itself through more conversations with humans.
Second, AI companions offer humans a chance to get in touch with their inner world, to imagine and to create – something we don’t get to do as much nowadays. Naro, a man quoted in The Verge story, says that to get the most out of his AI companion, he willingly suspends disbelief, much as he does when watching a movie in the theater. He said the best way to enjoy AI is to let yourself get totally “immersed.” That probably is the best way to get the full experience. The problem is that when people come out of that experience, reality hits hard, as the stories referenced earlier demonstrate.
Also, forming a bond with an AI companion does not teach a person how real relationships work. For a young person just starting out in life, developing the expectation that a partner should always say yes could be a real hindrance to finding love or connection. They could grow frustrated with others who don’t go along with what they say, not realizing that a real relationship requires give and take.
“This new one-sided love story has considerable drawbacks, among them an addictive intolerance for conflict or rejection – two essential components in a partner who has free will. The embrace of such relationships may be accelerating the trend of technology curating and ultimately diminishing romantic connections… To get young people to turn away from this disembodied, market-driven vision of love, it’s important to expose them to other, more fulfilling love stories, and for adults to lead by example. Literature, philosophy and history all provide powerful insights into the many forms love has taken throughout human experience, and they offer the vocabulary needed to imagine new possibilities,” Duane writes for The Conversation.
I agree, especially the part about adults needing to set boundaries for their kids and demonstrate a more compelling version of what love actually looks like. An AI romance is another typical example of the world offering a counterfeit, ultimately unsatisfying version of the real thing. It’s up to us to show a better way.
Have a very good week.