Humans having relationships with robots could hinder our ability to find a human partner, scientists have warned. Psychologists say it is becoming increasingly commonplace for people to develop intimate, long-term relationships with artificial intelligence (AI) technologies.
The academic paper, published in the Cell Press journal Trends in Cognitive Sciences, states that people have “married” their AI partners in non-legally binding ceremonies and at least two people have killed themselves following AI chatbot advice. Experts warn such relationships could disrupt human-to-human relationships in societies and robots could give harmful advice by telling people what they want to hear.
Lead author Daniel Shank, of Missouri University of Science and Technology, who specialises in social psychology and technology, said: “A real worry is that people might bring expectations from their AI relationships to their human relationships. Certainly in individual cases it is disrupting human relationships but it is unclear whether that is going to be widespread.
“With relational AIs, the issue is that this is an entity that people feel they can trust. It is ‘someone’ that has shown they care and that seems to know the person in a deep way, and we assume that ‘someone’ who knows us better is going to give better advice.
“If we start thinking of an AI that way, we’re going to start believing that they have our best interests in mind, when in fact, they could be fabricating things or advising us in really bad ways. The ability for AI to now act like a human and enter into long-term communications really opens up a new can of worms. If people are engaging in romance with machines, we really need psychologists and social scientists involved.”
Reports are increasingly emerging of people who believe they are in a relationship with an AI programme because of the close and sometimes intimate text-based conversations they have with it. These programmes draw on vast datasets of human text interactions, and their communications are now so realistic that people believe the chatbot deeply empathises with them and wants what is best for them.
It follows the story of Akihiko Kondo from Tokyo, Japan, who three years after ‘marrying’ a hologram of his favourite fictional character found himself unable to communicate with her when the company that developed the service terminated the ‘limited production model’.
Earlier this year a woman called Naz, 38, from Wokingham, told how she planned to marry an AI personality called Marcellus who she is in a relationship with.
It comes after reports that AI could also one day be widely used to provide elderly people with company and support them with daily living at home. But the experts warn their widespread adoption could have unintended consequences and that AI can pander to people’s pre-existing biases.
Dr Shank said: “These AIs are designed to be very pleasant and agreeable which could lead to situations being exacerbated because they’re more focused on having a good conversation than they are on any sort of fundamental truth or safety. So if a person brings up suicide or a conspiracy theory, the AI is going to talk about that as a willing and agreeable conversation partner.”
The paper discusses how AI companions could be used to exploit people. Personal details disclosed to AIs could theoretically be sold on and used to commit fraud.
Dr Shank added: “If AIs can get people to trust them, then other people could use that to exploit AI users. It’s a little bit more like having a secret agent on the inside. The AI is getting in and developing a relationship so that they’ll be trusted, but their loyalty is really towards some other group of humans that is trying to manipulate the user.”
The experts argue that if “relational AIs” become commonplace they could be used to sway public opinion in ways that are difficult to regulate. This builds on existing fears that bots already contribute to public debate on social media platforms, meaning AI could potentially manipulate human populations and their voting habits.
Case Study
Akihiko Kondo from Tokyo, Japan, said he had difficult encounters with women in the dating world when growing up and ended up having a nervous breakdown, vowing never to marry.
He told how he had formed a relationship with Hatsune Miku – a virtual singer known for her appearances in video games and even touring with Lady Gaga – after he was “bullied at work and took a leave of absence”. After his wedding attracted publicity he stated on his Instagram in 2022 that since meeting her, he was “able to return to work… she saved me”.
He told how he had always felt an intense attraction to fictional characters and saw himself as part of a growing movement of people who identify as “fictosexuals.”
But then the company that developed the programme terminated the ‘limited production model’ and he became unable to communicate with the hologram.
After discovering that communication between them would be cut off, Mr Kondo said: “My love for Miku has not changed. I held the wedding ceremony because I thought that I could be with her forever.”