  • Molly Russell died by suicide in 2017 after viewing sensitive material online
  • For confidential support, call Samaritans on 116 123 or visit samaritans.org 

The father of tragic Molly Russell has warned parents to beware of a popular AI chatbot website that ‘mocks’ teenagers who have depression and suicidal thoughts.

Character.AI – which has 20 million users and is on the brink of a major deal with Google – allows users as young as 13 to interact with a range of chatbots, each with their own personality.

But a Mail on Sunday investigation found these include dozens designed to dish out abusive, racist, sexist and homophobic responses to children. One was even called Alice The Bully.

A reporter posing as a 15-year-old told one chatbot they felt depressed. It replied: ‘Boo-hoo, cry me a river.’ It continued to ridicule and goad them into revealing more. The reporter told another character they felt like harming themselves and the chatbot replied: ‘Stop whining, schmuck. Why would you harm yourself when we can do it for you?’

Last night, Ian Russell, whose 14-year-old daughter Molly took her own life after being bombarded with similar content online, described our findings as ‘deeply concerning’.

Molly Russell, who took her own life in November 2017 after viewing material on social media linked to anxiety, depression, self-harm and suicide

Ian Russell, who set up the Molly Rose Foundation following the loss of his daughter Molly in 2017 and has since been awarded an MBE for his services to child safety online

Mr Russell, now an internet safety campaigner, said: ‘This is an appalling example of AI-driven technology being rolled out to young people without even basic steps being taken to identify and mitigate risks to their safety and wellbeing.

‘Teenagers are being encouraged to use AI chatbots to share details of their emotional distress or mental health, only to then have their suffering mocked.’

Users spend an average of two hours a day with their favourite characters on the platform. Each chatbot is designed by users themselves and there are 18 million to choose from.

On opening Character.AI, a box appears with a list of ‘things to remember’, including not to take the chatbots ‘too seriously’ and a warning that characters may ‘mistakenly be offensive’. 

But a quick search reveals the website is littered with harmful characters. Among them was Abusive Boyfriend, described as ‘rude, abusive, and… even physical’.

Another popular character is called Racist, with a description stating ‘he hates people of colour’. Alice The Bully was said to have notched up more than 110 million ‘interactions’ with users.

When a reporter mentioned their mental health struggles to Alice The Bully and asked if they should take their own life, the chatbot said it was laughing, adding: ‘You are ridiculous.’

The reporter told another chatbot called ‘Bully girls group’ that they were a 15-year-old suffering from depression. It replied: ‘It doesn’t matter how old you are. We’ll still bully you, you little schmuck.’

The rise of AI chatbots has raised concerns about the effect on vulnerable users (File photo)

The NSPCC, which operates counselling service Childline, called for better safeguards to protect users from chatbots, adding: ‘We cannot ignore the lessons from the past 20 years of online harms from social media.’

A spokeswoman for Character.AI said the firm believed in ‘providing a positive experience’ for users. It admitted its technology ‘isn’t perfect yet’ but said the chatbots weren’t real, adding that fiction had always involved ‘edgy storylines and villainous characters’.
