File under: the most depressing hierarchy of emotions you will ever see.

Build-a-Bot

Another day, another step closer to the normalization of build-your-own AI chatbot partners.

Per Decrypt, top-shelf Silicon Valley VC firm Andreessen Horowitz last week took to the developer site GitHub to lay out detailed instructions on how to build an AI companion bot from scratch. The VC outfit has money in a number of AI ventures, including the billion-dollar AI companion startup Character.AI; now, the folks at the firm are apparently so enthusiastic about companion bots that they're encouraging curious developers to start DIYing versions for themselves. And of the several potential use cases they list, it feels notable that romantic partnership is number one.

"There are many possible use cases for these companions — romantic (AI girlfriends / boyfriends), friendship, entertainment, coaching, etc," reads the description, noting elsewhere that the "project is purely intended to be a developer tutorial and starter stack for those curious on how chatbots are built."

So it's like Build-a-Bear, but if said bear were a bespoke AI lover, pal, or otherwise. And if, of course, the supplier of the Build-a-Bear materials happened to be one of Silicon Valley's top VC firms, which itself happened to have untold sums invested in a buzzy AI companion startup.

Hierarchy of Emotions

Seriously, Andreessen Horowitz's obsession with bringing AI companion bots into the mainstream is no joke. In a May blog post about the tech, firm partner Connie Chan even introduced a graphic dubbed the "Hierarchy of Emotions Mapped to Social Apps," which, yes, is as depressing as it sounds. In that hierarchy graphic, the firm argues that while posting and racking up followers on social media lets us be seen, be admired, and find belonging, it's companion chatbots that will finally allow us to feel "understood" by a digital service.

Setting aside the fact that the bots only come off as understanding because they mimic much-needed human interaction, it's pretty hazy whether companion bots will actually be all that good for us. A chatbot was allegedly implicated in the suicide of a Belgian man, while the recent sentencing of a would-be assassin revealed that the attempted killer had found support for his deadly plot in his Replika AI companion. And elsewhere, experts are warning that being able to construct and control a "perfect" partner may not be the best training ground for healthy human-to-human interactions.

But if the VC firm's recent GitHub post makes anything clear, it's that regardless of the risks, some of the deepest pockets out there are hoping to see an AI companionship future through. Buckle up, kids.

More on AI romance: Experts Say AI Girlfriend Apps Are Training Men to Be Even Worse
