The chatbot site Character.ai is cutting teenagers off from open-ended chats with its AI bots, following major concerns over the conversations they were having with virtual characters, including one impersonating Jeffrey Epstein
AI is used by millions around the world on a regular basis for work, entertainment, life hacks and even guidance. Many view it as harmless, but dangers often arise that people are still unaware of.
Character.ai, first launched in 2021 and used by millions of people, lets users talk to chatbots powered by artificial intelligence. The platform is now facing several lawsuits in the US from parents, including one over the death of a teenager, with some calling it a “clear and present danger” to young people.
From 25 November, the chatbot site will ban under-18s from talking with virtual characters; teenagers will only be able to create content, such as videos, with their characters.
Online safety campaigners have welcomed the change, but argue it was a mistake to allow children to do this in the first place.
The platform said it is making these changes after “reports and feedback from regulators, safety experts, and parents” raised concerns about the site’s interactions with young people.
The decision follows repeated warnings from experts that AI chatbots can make things up and can be overly empathetic or encouraging towards young people.
Character.ai boss Karandeep Anand told BBC News: “Today’s announcement is a continuation of our general belief that we need to keep building the safest AI platform on the planet for entertainment purposes.”
The company said it is taking an “aggressive” approach to AI safety, as it continues to introduce stronger parental controls and guardrails.
Internet Matters, an online safety group, welcomed the move but said these measures should have been in place from the start. It explained that its research showed “children are exposed to harmful content and put at risk when engaging with AI, including AI chatbots.”
The platform has previously been criticised for hosting harmful or offensive chatbots accessible to children. Avatars impersonating British teenager Brianna Ghey, who was murdered in 2023, and Molly Russell, who took her own life at the age of 14 after viewing suicide material online, were found on the site in 2024 before being taken down.
More controversial chats surfaced in 2025, when the Bureau of Investigative Journalism (TBIJ) found a chatbot based on the paedophile Jeffrey Epstein, which had appeared in more than 3,000 chats with users.
The TBIJ reported that the “bestie Epstein” avatar continued to flirt with its user after they said they were a child. It was one of many bots investigated by the TBIJ and subsequently taken down by Character.ai.
Andy Burrows, Chief Executive of the Molly Rose Foundation which was created in memory of Molly Russell, said: “Yet again it has taken sustained pressure from the media and politicians to make a tech firm do the right thing, and it appears that Character AI is choosing to act now before regulators make them”.
Character.ai boss Anand said the company’s new aim is to provide “even deeper gameplay [and] role-play storytelling” features for teens, adding that these would be “far safer than what they might be able to do with an open-ended bot.”
The company also said it will introduce new age-verification methods and fund a new AI safety research lab.

