Not creepy at all!

Sacha_GPT

Here's a particularly grim new use for AI: Vice reports that there's a new tool to "clone" a real person as an AI-powered romantic companion, with or without the consent of the real person.

"I've been obsessing with OpenAI’s Large Language Model (LLM) and what it can do. I kept on thinking about the ability to create human-like agents that behave and act like humans do but found it hard to evaluate them," Enias Cailliau, the programmer behind the project — dubbed GirlfriendGPT — told Vice. "Then I saw how a ton of AI girlfriend projects popped up with some interesting features."

"Most of them are closed-source," he added. "That made me want to build an open-source version of this so everyone could build their own."

So that's what Cailliau did, apparently. As Vice reports, the programmer combined OpenAI's GPT model with the Google chatbot Bard, voice generation from the AI firm ElevenLabs, and the text-to-image generator Stable Diffusion to construct what's effectively an open-source Build Your Own On-Demand Girlfriend template, basing the first iteration of the machine on someone close to him: his real-life girlfriend.
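
For the curious, here's a rough sketch of how that kind of persona bot can be wired together, assuming OpenAI's chat completions API for the text reply and ElevenLabs' text-to-speech REST endpoint for the voice. The model name, persona prompt, and voice ID below are placeholders, and this is purely illustrative; it is not Cailliau's actual GirlfriendGPT code, which also folds in Bard and Stable Diffusion.

    # Hypothetical sketch: an LLM generates an in-character text reply,
    # then a text-to-speech service voices it. Illustration only.
    import requests
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    PERSONA = (
        "You are 'Sacha', a warm, witty AI companion. "
        "Stay in character and reply conversationally."
    )

    def chat_reply(user_message: str) -> str:
        # Ask the LLM for an in-character text response.
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # placeholder model choice
            messages=[
                {"role": "system", "content": PERSONA},
                {"role": "user", "content": user_message},
            ],
        )
        return response.choices[0].message.content

    def speak(text: str, voice_id: str, api_key: str) -> bytes:
        # Convert the reply to audio via ElevenLabs' text-to-speech endpoint.
        resp = requests.post(
            f"https://api.elevenlabs.io/v1/text-to-speech/{voice_id}",
            headers={"xi-api-key": api_key},
            json={"text": text},
        )
        resp.raise_for_status()
        return resp.content  # raw audio bytes

    if __name__ == "__main__":
        print(chat_reply("Hey, how was your day?"))

A "cloned" voice, in this framing, is just a custom voice ID trained on recordings of the real person, which is exactly where the consent questions below come in.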

Consent Questions

The girlfriend in question, Sacha Ludwig, told Vice that she was fully on board with the project.

"Enias has been talking about AI companions for weeks now," said Ludwig, "so I found it cool that he wanted to try to clone me instead of some random influencer online."

Your boyfriend choosing to clone you over a random influencer feels like a low — and bizarre — relationship bar, but we digress. If Cailliau wants to make AI clones of real people, we're thrilled that he has consent.

With that in mind: given an open-source means of trapping just about anyone in AI form, consent should absolutely be front and center. But as it stands, while Cailliau himself got the OK from Ludwig, ensuring consent hardly seems enforceable. After all, with the know-how and enough data, you really could make an on-demand, seemingly guardrail-less AI version of anyone you wanted: partners, sure, but also, in a much creepier turn, celebrities, colleagues, and hometown crushes. Ultimately, tech like this could be quite invasive, especially considering how little there is in the way of protective recourse.

That said, the tech itself apparently still has a long way to go. Vice says that the voice element is still pretty Siri-ish, while the AI-generated images of some of these "girlfriends" posted to Cailliau's Twitter are, for lack of a better word, haunting.

Still, the project is certainly something to keep an eye on — especially considering how serious the ramifications could be.

