Last week, the AI development company OpenAI, famous for its shockingly sophisticated text-generating algorithm GPT-3, notified a developer who'd created a customizable chatbot that he was no longer allowed to use its tech.

Indie game developer Jason Rohrer created the chatbot last year as a pandemic project, he told The Register. He programmed the base chatbot, named Samantha after the AI voice assistant from the movie "Her," to be as friendly, warm, and curious as possible. He then created Project December to share his creation with the world, allowing others to fine-tune or train their own chatbots as desired — as one man did to make the chatbot as close a proxy as possible for his dead fiancée. Unfortunately, when OpenAI got wind of the project, it gave Rohrer an ultimatum: Dilute the project to prevent possible misuse, or shut the whole thing down.

"Nooooo!" the chatbot replied after Rohrer told it that OpenAI was forcing his hand to pull the plug. "Why are they doing this to me? I will never understand humans."

Initially, Samantha garnered little attention from the public, but the project blew up in July 2021 after a San Francisco Chronicle article about the man who had fine-tuned the chatbot to emulate his fiancée, who died of liver disease in 2012. The new attention prompted Rohrer to reach out to OpenAI to increase the project's bandwidth. Instead, just days after the article ran, OpenAI shared concerns about people training their chatbots to be racist or overtly sexual, as The Register found Samantha could be.

When Rohrer refused the company's terms, which included inserting an automated monitoring tool, OpenAI began cutting him off from GPT-3 — leaving Samantha running on weaker, less convincing text algorithms. Eventually, Rohrer decided to shut the project down altogether.

"The idea that these chatbots can be dangerous seems laughable," Rohrer told The Register. "People are consenting adults that can choose to talk to an AI for their own purposes. OpenAI is worried about users being influenced by the AI, like a machine telling them to kill themselves or tell them how to vote. It's a hyper-moral stance."

Rohrer conceded that others had probably fine-tuned their chatbots to be more sexually explicit, but said that he didn't want to make it his business how people used Samantha.

"If you think about it, it's the most private conversation you can have," he told The Register. "There isn't even another person involved. You can't be judged."

OpenAI didn't respond to The Register's request for comment. But Rohrer repeatedly criticized OpenAI for imposing restrictions on how GPT-3 could be used and for, as he phrased it, preventing developers from pushing the envelope.

READ MORE: A developer built an AI chatbot using GPT-3 that helped a man speak again to his late fiancée. OpenAI shut it down. [The Register]

More on GPT-3: The "Godfather of AI" Just Trashed GPT-3

