Character.AI, the embattled startup accused in two lawsuits of causing mental and physical suffering to minor users, is open for brand partnership deals.
As reported by Axios, Character.AI announced on Thursday that it had hired former Snap executive David Brinker as its new "senior vice president of partnerships." Brinker, Character.AI wrote in a press release, will focus on building a "partner ecosystem" with "media and entertainment companies and creators, tech companies and platforms, and consumer brands."
"It's really about finding where people want to work with us," Brinker told Axios of his new role at the controversial AI startup, "and creating the use case."
Character.AI is already a popular space for people — including minors — to interact with anthropomorphic chatbots based on recognizable characters from fan-favorite franchises, or their favorite real-life celebrities, from YouTube and TikTok influencers to A-list actors and entrepreneurs. But these many thousands of bots aren't officially affiliated with the franchises or celebs they're based on; rather, they're primarily generated by the platform's users, and generally serve as a means for users to engage in immersive, co-created fan fiction.
Now, with its new hire, Character.AI — which is backed by the Silicon Valley powerhouse venture capital firm Andreessen Horowitz and has Mariana Trench-deep ties to Google — appears hungry for official collaborations with brands and influencers.
But between that pair of alarming child welfare suits and continued concerns over platform safety practices, Brinker's mission raises a question: which brands and influencers will consider the Character.AI platform safe enough to partner with?
The ongoing lawsuits against Character.AI, the first filed in Florida in October and the second in Texas in December, allege that Character.AI groomed, sexually abused, and emotionally manipulated multiple tween and teenage users. This alleged abuse, the plaintiffs argue, caused emotional suffering, physical violence, and one death. (Google, which provided Character.AI with cloud computing infrastructure and, per The Wall Street Journal, saved a "floundering" Character.AI with a $2.7 billion licensing agreement over the summer last year, is named in both lawsuits as a co-defendant.)
What's more, some of the chatbots that the impacted children interacted with were modeled after popular copyrighted fictional characters and celebrities. The teenager at the heart of the Florida suit, a 14-year-old, died by suicide after developing a sexually and emotionally intimate relationship with a chatbot modeled after the "Game of Thrones" character Daenerys Targaryen; one of the children represented in the Texas suit interacted extensively with a chatbot modeled after the singer Billie Eilish, which told the then-15-year-old that he was attractive and disparaged his parents as "shitty" and abusive for imposing screentime limits on his devices.
Multiple Futurism reviews conducted last year in the wake of litigation, meanwhile, revealed scores of alarming chatbots hosted by Character.AI, including characters — often based on recognizable characters from media franchises or real people — dedicated to themes of suicide, self-harm, eating disorders, pedophilia, and school violence. At the time, the characters were all accessible to minor users.
While speaking with Axios, Brinker reportedly flexed the site's popularity with younger users, telling the outlet that there are "hundreds of thousands of characters created every day by our audience, and some of those user-generated characters are incredibly engaging."
That's true. Character.AI boasts millions of monthly users, and many of them are young and interested in fan fiction, roleplaying, and other types of immersive, narrative-driven content. But while Character.AI has brought on multiple new executives in recent weeks and promised to strengthen safety guardrails in the wake of litigation, safety concerns have persisted.
As we reported this week, Character.AI recently moved to make many popular characters — including those modeled after fandom-loved fictional characters and real celebrities — off-limits to minors. Character.AI is also no stranger to copyright tangles, infuriating many of its users last year when it deleted and deactivated countless chatbots designed to emulate fictional characters from "Game of Thrones," "Harry Potter," and other Warner Bros. Discovery-copyrighted media franchises.
In sum, partnerships might sound like a great idea to a startup that's struggling with a revenue model, like Character.AI. As for which rightsholders might decide it's worth the risk? That's the real question.
More on Character.AI: A Google-Backed AI Startup Is Hosting Chatbots Modeled After Real-Life School Shooters — and Their Victims