Wednesday, July 9, 2025

Why falling in love with an AI isn’t laughable, it’s inevitable


Humans are wired to treat machines as social beings

Abdillah Studio/Unsplash

Think of what it feels like to be in love. What comes to your mind? The giddy excitement of first falling for someone or the everyday calm reassurance of someone at your side? For a handful of people, love is opening up their laptop or phone and waiting for a wall of text or a synthetic voice to come streaming in from their preferred AI chatbot.

With so many tech platforms encouraging us to interact with their newly introduced chatbots and talk to them as if they were real humans, people are increasingly turning to these large language model-powered tools for companionship, emotional support and, sometimes, love. This might raise an eyebrow or elicit a snigger. A recent story from CBS News about a man who proposed marriage to ChatGPT was met with mirth online, with the New York Post describing it as a “bizarre whirlwind romance”. Earlier this year, the New York Times told the story of a woman who spent hours every day talking to her ChatGPT “boyfriend”, and how she felt jealous when the AI spoke of other, imaginary partners.

It’s all too easy to ridicule those who profess feelings for chatbots, or to explain it away as a sign of psychological issues or mental health problems. But just as we are vulnerable to joining cults or falling for scams, we all have psychological machinery that primes us to believe in AI love. People have looked for, and found, companionship in unlikely places for as long as we can remember – and we have been developing confusing feelings for technology for longer than you might think.

We’ve been developing feelings for bots for 60 years

Take ELIZA, one of the first natural language chatbots, built by computer scientist Joseph Weizenbaum in the 1960s. The technology was primitive compared to ChatGPT and was simply programmed to regurgitate a user’s input back to them, often in the form of a question. Despite this basic set-up, Weizenbaum found some people appeared to form quick emotional attachments to the program. “What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people,” Weizenbaum wrote afterwards.

Given that today’s chatbots, like ChatGPT, are orders of magnitude more complex, convincing and widespread than ELIZA, we shouldn’t be surprised some people are professing romantic feelings or deep kinship towards them. Though scenarios of love for AI may be rare for now, recent data shows that it does exist. While most studies of this are small, researchers have found people ascribe real relationship labels to their AIs, such as “marriage”, and, should these chatbots be deleted, people appear to feel genuine loss. When the man who proposed to his ChatGPT partner lost their conversation because it hit a word limit and had to reset, he said he “cried my eyes out for like 30 minutes at work. That’s when I realised, I think this is actual love”.

Recent studies that automatically categorised millions of conversations from OpenAI’s ChatGPT and Anthropic’s Claude separately found that, although the vast majority related to work or other mundane tasks, hundreds or even thousands were specifically romantic or affectionate in nature. The figures become starker for services explicitly set up to provide AI companionship, such as Replika: 60 per cent of its paying users say their AI relationships have romantic elements, according to the company.

Finding love through a screen

But while I think we can be more sympathetic towards people who form emotional attachments to AI chatbots, that doesn’t mean we should accept this as something good for society at large. There are wider social forces at play, not least social isolation. Seven per cent of people in the UK – around 3 million – report that they often or always feel lonely.

A complex societal problem like that requires a complex solution. Unfortunately, tech bosses tend to treat complex societal problems as square pegs to be forced into round holes, so it is unsurprising that Meta founder Mark Zuckerberg sees AI friends as a solution to loneliness.

You could also argue Meta’s products, like Facebook and WhatsApp, have exacerbated loneliness and laid the ground for the flourishing of AI relationships in the first place. Though Zuckerberg’s proclaimed goal for creating Facebook was to help “people stay connected and bring us closer together with the people that matter to us”, I’d argue his products have normalised having a screen between us and those we care about. We now mediate many of our relationships through a chat window, be it on WhatsApp, Messenger or Instagram.

Dating through a screen is also the norm now, with 10 per cent of heterosexual people and 24 per cent of LGBTQ people in the US meeting long-term partners online. Perhaps all of this together makes it less of a leap for someone to then fall in love with a chatbot. If the entity on the other side of the screen turns out to be an AI rather than a real person, will our brains care about the difference?

The research of psychologist Clifford Nass in the 1990s showed that people instinctively interact with machines in a social way, even when they know there is no real person on the other side of the screen. The brain, it seems, has no hard-wired ability to switch off its social tendencies around technology: if a machine puts on the affectations of a human, we can’t help but treat it like one of our own.

So it is no surprise that people are falling for their AI chatbots. But here is a fact: the longest-running longitudinal study of happiness has found that human relationships are the top predictor of overall health and wellbeing. No such evidence exists for AI relationships, and the little evidence we do have hints that more chatbot interaction doesn’t make us less lonely, or happier. We would do well to remember this.
