Character.AI Users in Full Meltdown After Minors Banned From Chats

"I guess that's the end of C.AI."
Maggie Harrison Dupré
The embattled chatbot company Character.AI is banning minors from its platform. Unsurprisingly, the site's users have a lot of thoughts.
Illustration by Tag Hartman-Simkins. Source: Getty Images

Last week, the embattled chatbot platform Character.AI said that it would move to ban minors from conversing with its many thousands of AI companion and roleplay bots. Site users, self-avowed minors and adults alike, have a lot of thoughts.

The policy change comes as the controversial AI company continues to battle multiple lawsuits alleging that interactions with its chatbots caused real-world emotional and physical harm to underage users, including multiple teens who died by suicide following extensive conversations with bots hosted by the platform.

Character.AI now says people under 18 will no longer be allowed to engage in what it calls "open-ended" chats: seemingly the long-form, unstructured conversations on which the service was built, where users can text and voice call back and forth with the site's anthropomorphic, AI-powered "characters." Minors won't be kicked off the site entirely; according to Character.AI, it's working to create a distinct, presumably much more limited "under-18 experience" that offers teens some access to certain AI-generated content, though specifics are pretty vague.

To enforce the shift, Character.AI says it'll use automated in-house age verification tools as well as third-party tools to determine whether a user is under 18. By November 25, if the site determines that an account belongs to a minor, that user will no longer be able to engage in unstructured conversations with the platform's emotive AI chatbots, according to the company.

Given that unstructured chats with platform bots have long been the company's core offering, the promise to ban minors from such interactions — even if they'll still have some access to the site — is a huge move. It was also bound to be controversial with the company's fanbase, as many users have formed close emotional bonds with various AI characters, with some reporting having used the platform for "comfort" or "therapy." And though the company has consistently declined to share age data about its users with journalists, it's understood that a huge chunk of the platform's current user base is made up of minors.

The details and possible impacts of the promised transformation have been much debated over on the very active r/CharacterAI subreddit, where users have flocked to post statements like “it is officially over” and “this is INSANE” in response to the news; at the same time, other users are admonishing each other for being hypocritical or overdramatic.

Many of those upset with the change say they’re minors, and have expressed an unsurprising blend of concern, sadness, and anger. What’s more surprising is the breadth of who these young people actually blame for the platform policy shift — from parents who have raised safety concerns, to Character.AI developers, to other teens.

“I very much blame my own fellow teenagers over anything,” reads one comment. “If they’d just interacted with AI normally, then this wouldn’t have happened.”

“I genuinely do not understand what this new update is,” another user wrote. “Do the devs seriously not understand that the majority of their users are likely under 18…?”

Other self-reported minors, though, expressed conflicted feelings, in some cases saying that while they believe Character.AI has had a negative impact on their lives, they and other teen peers now rely on it.

“I’m a minor on C.AI,” wrote one user. “No I don’t think it’s good I’m on it. I feel like C.AI has stunted my learning, and my social skills. One year ago I found C.AI and joined. I got addicted, and I’m not kidding when I tell you I had a screen time of 15 hours a day and 13 of those were on f*cking C.AI.”

“[I don’t know] how I feel about the new ID identification system thing because I feel like I need C.AI,” they continued. “It kinda keeps me alive. (I am severely depressed and I’m trying to stop but I’m bed rotting all day and s**t and [I know] it’s so unhealthy to rely on AI but I can’t help myself.)”

Other users, including those who say they’re adults, say they’re in favor of the changes in theory — adult users in particular have long lamented that the number of kids on the site drags down the quality of the experience overall — but are deeply skeptical of what enforcement looks like in practice.

Character.AI says its in-house tool will work to identify minors based on the nature of their interactions with the platform, as well as information gathered from shared accounts like email. Persona, meanwhile, a third-party verifier that Character.AI said it’ll incorporate into its process, requires that people upload government IDs — something that many users really, really don’t want to do. (Many cited recent high-profile data leaks at Discord and the Tea app.)

“I heavily doubt this is gonna work out,” wrote one commenter. “I know that I wouldn’t EVER give my ID to a third-party service (trustworthy or not) considering how dangerous it actually is.”

“I’m 20, and no way in hell am I putting my ID into a chatbot site, I guess that’s the end of C.AI,” said another. “Like, why would I risk identity theft for a replaceable app, when C.AI isn’t even top-tier???”

Amid the tumult, though, several self-reported minors took to the subreddit to say that they're in favor of the move to ban fellow kids and teens from the site, explaining that they've either witnessed peers becoming addicted or been hooked themselves, and believe the only solution is to take the platform away.

“As a minor, I’m not upset,” said one commenter. “Seriously, this app is a drug, it’s a disease. It’s addictive as hell and mentally damaging. I want to quit, tried to, but I couldn’t. Having it taken away is for the better.”

More on Character.AI: Character.AI Says It’s Made Huge Changes to Protect Underage Users, But It’s Emailing Them to Recommend Conversations With AI Versions of School Shooters

Maggie Harrison Dupré

Senior Staff Writer

I’m a senior staff writer at Futurism, investigating how the rise of artificial intelligence is impacting the media, internet, and information ecosystems.