Leading AI company to ban kids from chatbots after lawsuit blames app for child's death


This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).

Popular artificial intelligence (AI) chatbot platform Character.ai, widely used for role-playing and creative storytelling with virtual characters, announced Wednesday that users under 18 will no longer be able to engage in open-ended conversations with its virtual companions starting Nov. 24.

The move follows months of legal scrutiny and a 2024 lawsuit alleging that the company’s chatbots contributed to the death of a teenage boy in Orlando. According to the federal wrongful death lawsuit, 14-year-old Sewell Setzer III increasingly isolated himself from real-life interactions and engaged in highly sexualized conversations with one of the platform's bots before his death.

In its announcement, Character.ai said that during the transition period, chat time for under-18 users will be limited to two hours per day, a cap that will gradually decrease in the weeks leading up to the Nov. 24 cutoff.

 



A boy sits in shadow at a laptop computer on Oct. 27, 2013. (Thomas Koehler/Photothek / Getty Images)

"As the world of AI evolves, so must our approach to protecting younger users," the company said in the announcement. "We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly."



Character.ai logo is displayed on a smartphone screen next to a laptop keyboard. (Thomas Fuller/SOPA Images/LightRocket / Getty Images)

The company plans to roll out similar changes in other countries over the coming months. These changes include new age-assurance features designed to ensure users receive age-appropriate experiences and the launch of an independent non-profit focused on next-generation AI entertainment safety.

 

"We will be rolling out new age assurance functionality to help ensure users receive the right experience for their age," the company said. "We have built an age assurance model in-house and will be combining it with leading third-party tools, including Persona."


A 12-year-old boy types on a laptop keyboard on Aug. 15, 2024. (Matt Cardy)


Character.ai emphasized that the changes are part of its ongoing effort to balance creativity with community safety.

"We’re working to keep our community safe, especially our teen users," the company added. "It has always been our goal to provide an engaging space that fosters creativity while maintaining a safe environment for our entire community."

Olivia Smith