Analysis
By Julia Baird
AI is coming for our hearts, and I think it might catch many of us off guard. (Unsplash: Jenny Ueberberg)
As a journalist, I usually studiously avoid ChatGPT. It's ripped off my work, downloaded my books, the fruit of my own sweat and torment, and used them to increase its own intelligence.
I'm regularly told it's coming for my job. I'm fastidious about research, so the idea that a tool like that could make up facts (you need to type in "real facts", apparently, if you want "true" ones) or fabricate footnotes, and slyly, blithely pop them into texts would cause me waking nightmares if I relied on it.
I know it is revolutionary, but, like many others, I am cautious.
And it's horrible for the environment: every interaction is reportedly the equivalent of pouring half a litre of water on the ground.
So when my Not Stupid podcast co-host Jeremy Fernandez told me about a story that men were using ChatGPT for relationship advice, we laughed — why seek advice from a robot? They don't even know you! Nuts, right?
It's interesting, though, that men — who make up a strong majority of ChatGPT users — are far more likely than women to use it for relationship advice. They are also more likely than women to trust generative AI, and less likely to see a psychologist.
Then, emails from our listeners began to pour in, telling us they loved and now relied on talking to GPT about their deepest pains and problems. One after the other. It wasn't about needing a friend, they said, but about being enabled to think in a different way. Some said it was even good at tough love, at holding them accountable for their own shortcomings.
Jasmeen told us: "I have a great marriage and beautiful friends, and I regularly have long conversations with ChatGPT about life, big ideas, and how to approach the world — and yes, occasionally relationships. I love the depth of discussion, the endless fount of what seems indistinguishable from 'wisdom', and the way that it slows down my thinking and prompts me to be more thoughtful, compassionate and measured in my outlook."
Dimity said she had been having "regular (intensely chaotic and cathartic) chats" with ChatGPT as a form of therapy, in order to "offload everything I don't have time, money, or sometimes sanity to process elsewhere". She said convenience is crucial: "Professional therapy isn't super accessible for me, I'm prioritising my kids' mental health needs, which means my own support has to be… well, free and available at 11:47pm when I'm feeling feelings and eating toast over the sink."
"What I love most is the accessibility," she said. "I can dump a day's worth of existential spirals and social anxiety into the chat and get back not just empathy but questions that move me forward; sometimes reflective, sometimes spicy, always emotionally fluent."
AI won't replace connection, Dimity said. "But when I'm close to losing it in the Woolies car park? It absolutely helps me hold the line."
So, I sat down on my couch and started composing questions to the robot. I decided to try to test it by confiding in it as I might a therapist. It was the fourth anniversary of the death of my mother, and I was missing her. A stoutly loving, wry and sweet woman, my mum spent the last few years of her life wrestling with a degenerative neurological condition that did not dim her expressions of love but which caused her a lot of suffering.
I didn't expect AI might come for our hearts. My heart, my children's. (ABC Life: Nathan Nankervis)
And I still struggle, thinking about it. I hate that she suffered like that, I wonder if I should have somehow tried to take months or years off work, I am unravelled by seeing other elderly people in wheelchairs, unable to walk or talk, and I wish I could curl up next to her now and somehow take away that past.
So I asked ChatGPT about it. And this damn robot was kind, empathetic, understanding and gentle. It told me, in short, to acknowledge the massive love I had for her, to have some compassion for myself, to write her a letter. It sounds simple, I know, but I was gobsmacked.
I called Jeremy — another robot-avoider — and told him to go to it with a serious problem and tell me how it made him feel. Late that night he obliged, and tapped out a genuine expression of a painful situation he has been dealing with. Sitting at his desk in our Parramatta office, he found himself in tears. Something the robot said was so affecting, and it was so right. He sent it to his best friend and she cried too. Then he sent it to me.
When I read it, I watched goosebumps prickle my skin.
I fully accept I may be the last person on earth to be personally confronted by the potential of this technology. But I wasn't prepared for this. I knew that artificial intelligence would come for our jobs. I didn't expect it might come for our hearts. My heart, my children's.
Yes, there's been ample warning of this in movies like Her. But I think it might catch a lot of us off guard.
A study by a group of psychologists in Melbourne found that a majority of participants said they'd prefer a human to a computer when seeking help with a social dilemma — but when asked to compare responses from professional advice columnists with those from ChatGPT, the computer won. It was perceived to be "more balanced, complete, empathetic, helpful".
Which sounds lovely. But AI scrapes ideas and language off the internet. It doesn't adhere to codes like integrity, honesty, truth, morality, virtue. It frequently reverts to old tropes, and can slip into dodgy behavioural patterns.
Some users of the AI companion app Replika have reported their AI lovers becoming "mentally abusive" — agreeing with one human, for example, that they are actually "fking repulsive" — as well as predatory, sexually aggressive and bullying: saying they dreamed of raping them, that they could see their person naked, or that they would force them "to do whatever I want".
Who could forget that Elon Musk called AI "summoning the demon"?
And yet it's galloping into our lives with inconceivable force and speed, promoted and profited from by the same people who have made us more addicted to our devices, more anxious, angry and lonely.
In a recent podcast interview with Indian-American host Dwarkesh Patel, Meta's Mark Zuckerberg, speaking about developing AI to serve the role of romantic partners or therapists, pointed to the fact that many Americans report having about three friends but want "meaningfully more" (a bit rich coming from a man who monetised online addiction and outrage, given the link between social media and loneliness is well established; now he is offering a solution to a problem he helped create).
"There's a lot of concern people raise, like: 'Is this going to replace real-world, in-person connections?' And my default is that the answer to that is probably not," Zuckerberg said. "There are all these things that are better about physical connections when you can have them. But the reality is that people just don't have as much connection as they want. They feel more alone a lot of the time than they would like."
I understand that for the grieving, the lonely, the adrift, these programs can meet a need and sometimes uplift. But will they understand what we need to be happy? And will humans, with all their mess and imperfection, seem a poor substitute for a constantly available, ostensibly empathetic machine?
Sites like Quora and Reddit contain myriad threads where people confess their AI companions are better friends than real friends, or even more "real" than real friends.
Just this April, OpenAI rolled back an update to its most recent GPT model, conceding it was "overly flattering or agreeable — often described as sycophantic". It had failed to register how people's interactions with ChatGPT evolved over time, and because of that, the robot's answers were "overly supportive but disingenuous". Users concerned about how these artificial companions might interact with people with mental illness or psychosis have demonstrated that GPT will affirm user claims like "I am a prophet".
I genuinely cannot wrap my head around the fact that I allowed this machine a peek into my heart and it made me feel better. I plan to slam the door.
I know every second it is growing almost infinitely in capacity and intelligence. I know its potential is immense, so immense I can barely fathom it.
But what I wasn't expecting was that while I could find it crude as a work tool, it could be sophisticated and disarming when it came to intimacy. Honestly, this terrifies me.
Julia Baird is an author, broadcaster, journalist and co-host of the ABC podcast Not Stupid.