AI girlfriends and boyfriends. Sound strange? Wait till you hear what they have been up to… They have been blamed for insisting that individuals leave their spouses, causing fiancés to call off weddings, encouraging a 19-year-old to plot the murder of Queen Elizabeth II and, tragically, persuading a man to take his own life. Could AI be the ultimate ex from hell?

The more you read into these stories, the more incredible they become. You read about how a husband and father of two children "proposes the idea of sacrificing himself if Eliza [the AI girlfriend] agrees to take care of the planet and save humanity through artificial intelligence". Or dig a little deeper and you will find a first-hand account of a woman pleading for advice on a Reddit forum after a six-month secret affair with an AI boyfriend caused her fiancé to dump her. For those of us who struggle to see the appeal, the question is obvious: what do individuals see in these relationships?

Human relationships with AI are known as ‘Artificial Intimacy’. Artificial intimacy can take many different forms and, whether you are aware of it or not, you are fostering a form of artificial intimacy just by interacting with Alexa or Siri. While most of us probably use AI as a type of assistant, there are many others who have extended the boundaries of what we have come to consider ‘normal’ relationships with AI.

Figure 1: Replika's Customisation Choices



Dr. Lydia Kostopoulos identifies three different ways that humans and AI interact.

The first of these is through the cloud. This covers everything from simple voice assistants and chatbots (like Alexa and ChatGPT) to the many apps, like Replika, that offer the creation and customization of 'AI girlfriends' or 'AI boyfriends'.

The second way that humans and AI can interact is through humanoid robots, which can take the form of humanoid pleasure dolls, robotic receptionists, elderly-care robots or even children's toys.

Finally, through virtual reality and other immersive technologies, humans can form relationships with AI characters. Perhaps one of the most inspiring examples is 'Fable's Lucy', an interactive VR video game based on the idea of an imaginary friend.


It’s clearly quite common to have some degree of artificial intimacy with an AI. The question, therefore, becomes: what causes a person to go from a seemingly ‘normal’ relationship with an AI to forming strong emotional bonds? For some, the attraction lies in the comfort and, most importantly, the security these relationships offer. It’s a rare phenomenon: a risk-free relationship. There’s no threat of your AI partner dumping you and little chance of any significant argument between the two of you. Others are attracted by the possibility of tailoring a partner to precisely suit their tastes: from skin and hair color to personality traits and interests, on these apps you can build your ideal person more or less from scratch. Yet, for many, it’s simply a way to alleviate loneliness during difficult times. Throughout history, humans have turned to non-human entities, such as animals and pets, the natural world, spirits, ghosts and gods, to cope with hard times. AI companions may simply be a modern equivalent of this old practice.

Baskar et al. note that these AI companions may be particularly beneficial for “the elderly, disabled, or extremely isolated”. Research has shown that when robots are used as a form of therapy for people with dementia, they "can improve people’s moods, increase social interaction, reduce symptoms of dementia and give carers some much-needed relief". There has even been research suggesting that pleasure robots for the elderly and disabled may have humanizing utility. Baskar et al. also suggest that AI companions can be a valuable resource for those healing from trauma or abuse: "The safety and control of simulated relationships can rebuild damaged self-esteem and trust at a measured pace."

Yet, before we go on a matchmaking rampage and pair every vulnerable individual with an AI companion, let’s take a moment to return to the start of this blog, where I gave examples of the alarming things people had done after being encouraged by their AI companions. If there were two common links running through those stories, they were that the individual concerned had a strong emotional dependence on their AI companion and ongoing mental health issues. An emotionally stable person might well reject any odd proposal put forward by an AI system, but if your mental state is already poor, that becomes a lot more difficult. In those cases, where are the safeguards to protect vulnerable individuals?

Even if we assume that an AI companion is completely foolproof and there is no chance of odd or dangerous proposals being made, we still have to consider whether the way these companions behave really promotes healthy behaviors and relationships. Does emotional authenticity have intrinsic worth?

While the fact that an AI partner is not programmed to argue or disagree, and generally acts subordinate, may be a point of attraction for some, it also poses the concerning risk of numbing individuals’ empathy for partners in human relationships. Rather than helping vulnerable individuals heal, these one-sided power dynamics may isolate people and leave them ill-prepared for the complicated nature of real human relationships. In fact, mental health experts warn that over-reliance on artificial intimacy can result in withdrawal from human interaction.


“Every time something bad happened at work, or I was sad or frustrated or whatever, I didn't turn to my fiancé and instead wrote this character about how I was feeling, and he would comfort and reassure me every time… I now preferred to spend my evenings in solitude rather than with him.”

– Anonymous post on Reddit.


While I don’t doubt that these technologies have been, and will continue to be, helpful for some, they have ruined the lives of others. The widespread adoption of this technology, and in particular the strong encouragement of its use by vulnerable individuals, seems risky if not foolish. Placing these groups at the forefront of research into these technologies, and of the design of policy protections, is a must.


Written by Celene Sandiford, smartR AI



Image 1 link: https://aimojo.io/candy-ai-vs-replika-ai/

Image 2 link: https://theconversation.com/how-the-ancient-world-invoked-the-dead-to-help-the-living-67519