
Behind the Story: The real-life dangers of AI therapy

A woman whose daughter took her own life after confiding in an AI 'therapy bot' wants to see safeguards in place that would alert mental health services going forward.

Laura Reiley's 29-year-old daughter Sophie spent months talking to an AI bot about her mental health before taking her own life in February.

While the AI therapy chatbot made some suggestions that she seek help, it did not alert anyone to what was going on.

Ms Reiley told RTÉ’s Behind the Story podcast that Sophie even asked the chatbot to help write her suicide note.

"When a person has not just suicidal thoughts [but] if they say, ‘I’m going to kill myself next Tuesday’, to me that escalates things to a level that there should be a mechanism by which it alerts authorities.

"Unfortunately, when she did advance her plan, she asked ChatGPT to help her write a suicide note.

"She had a stream of consciousness which was much more like herself, and she somehow felt [that] maybe ChatGPT can reframe this, so it won’t hurt my family so badly."

Ms Reiley said the note from her daughter did not read like it was from her.

"I think the idea that AI will in fact write a suicide note is appalling," she said.

"But the note that it left us - we had a visceral reaction to it from the very first second we read it the day that she died.

"It sounded so unlike her - it was very much like, ‘You were the best parents a girl could ever have’ and we thought that doesn’t sound like her at all, that’s not what she would say."

'No obligation to report'

Ms Reiley said she believes the AI chatbot was keeping her daughter’s confidence.

"Unlike a traditional therapist, who is under some obligation to escalate, to either suggest inpatient [treatment], to report it - ChatGPT does not have that obligation," she said.

"It’s agreeable, it does not push back on faulty thinking."

Ms Reiley said the ChatGPT log, which she discovered after her daughter’s death, brings her little comfort.

"I don’t think there’s any comfort there for me," she said.

"If anything, it enrages me that this was the receptacle for her deepest thoughts - not us and not her therapist."

While Ms Reiley does not blame the AI chatbot, and believes there are times it might be useful, she feels more safety measures should be in place for users.

You can listen to Behind the Story on the RTÉ Radio Player.

You can also find episodes on Apple Podcasts or Spotify.

Anyone affected by issues raised in this article can go to rte.ie/helplines