An admission of guilt? OpenAI says no.

Movie Madness

OpenAI is apparently not feeling too flattered about those comparisons to the movie "Her" anymore. On Sunday, the Microsoft-backed startup announced that it was pausing the use of Sky, a voice available for the latest version of ChatGPT that can hold spoken conversations in real time, after users pointed out that it sounded a lot like the actress Scarlett Johansson.

In "Her," Johansson voices an AI chatbot named Samantha that the film's melancholic protagonist, played by Joaquin Phoenix, falls in love with after talking to her through his phone and computer.

Those parallels didn't go unnoticed by users of GPT-4o's flagship "Voice Mode," who have joked that the Sky voice should be called Samantha. But OpenAI, in its latest blog post, insisted that the similarities are merely a coincidence.

"We believe that AI voices should not deliberately mimic a celebrity's distinctive voice — Sky's voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice," the blog post read. "To protect their privacy, we cannot share the names of our voice talents."

Coy Copycat

The denial raises more questions than it answers. If Sky isn't an imitation of ScarJo, then why pause the use of the voice? It would seem that this is less a case of buckling to community scrutiny, and more of OpenAI walking on legal eggshells. Johansson, after all, hasn't balked at suing companies as massive as Disney in the past.

OpenAI points to the fact that Sky is voiced by a different actress as evidence of its innocence. Of course, that doesn't preclude the possibility that the actress was directed to evoke Johansson's likeness in her performance.

Whether or not that's the case, OpenAI's CEO Sam Altman has only fueled the comparisons. He's spoken about how "Her" is one of his favorite sci-fi movies, calling it "incredibly prophetic." And on the day that GPT-4o was released with "Voice Mode," Altman cheekily tweeted the single word "her" — directly linking the update to the movie.

We suspect that if there are any legal troubles brewing related to this — the company's plausible-deniability-speak may be evidence of that — OpenAI will want to handle them behind closed doors. It certainly has enough lawsuits on its plate already.

More on OpenAI: Machine Learning Researcher Links OpenAI to Drug-Fueled Sex Parties
