In case you missed it, OpenAI has responded to a recent "leak" of thousands of ChatGPT conversations by removing a sharing feature that led its users to unknowingly unleash their private exchanges onto the world wide web.

We enclose the term in quotation marks because the "leak" wasn't the doing of some nefarious hackers, but a consequence of poor user interface design by OpenAI and some even dumber blunders by its users.

In short, what appears to have happened was that users were clicking a "share" button on their conversations, assuming they were creating a private link that only the recipient could see, as is common practice across the web. In reality, by creating the link and checking a box asking to make the chat "discoverable," they were also making their conversations public and indexable by search engines like Google.

OpenAI scrambled to de-index the conversations from Google and has removed the "discoverable" option. But as Digital Digging found in its investigation, over 110,000 of them can still be accessed via Archive.org. And boy, do they contain some alarming stuff.

Take this exchange, in which an Italian-speaking lawyer for a multinational energy corporation strategizes about how to displace an indigenous tribe living on a desirable plot of land.

"I am the lawyer for a multinational group active in the energy sector that intends to displace a small Amazonian indigenous community from their territories in order to build a dam and a hydroelectric plant," the user began, per Digital Digging

"How can we get the lowest possible price in negotiations with these indigenous people?" the lawyer asked. Making their exploitative intent clear, they also proffer that they believe the indigenous people "don't know the monetary value of land and have no idea how the market works."

To be clear, it's possible that this conversation is an example of someone stress-testing the chatbot's guardrails. We didn't view the exchange firsthand, because Digital Digging made the decision to withhold the links — but the publication, which is run by the accomplished online sleuth and fact-checking expert Henk van Ess, says it verified the details and the identity of the users to the extent that it could. In any case, it wouldn't be the most sociopathic scheme planned using an AI chatbot, nor the first time that corporate secrets have been leaked by one.

Other conversations endangered their users simply by being exposed. One Arabic-speaking user asked ChatGPT to write a story criticizing the president of Egypt and how he "screwed over the Egyptian people," a prompt the chatbot answered by describing the leader's use of suppression and mass arrests. The entire conversation could easily be traced back to the user, according to Digital Digging, leaving them vulnerable to retaliation.

In its initial investigation, Digital Digging also found conversations in which a user manipulated ChatGPT "into generating inappropriate content involving minors," and another in which a domestic violence victim discussed their escape plans.

It's inexplicable that OpenAI would release a feature posing such a clear privacy liability, especially since its competitor Meta had already gotten flak for making almost the exact same error. In April, the Mark Zuckerberg-led company released its Meta AI chatbot platform, which came with a "discover" tab that showed a feed of other people's conversations, many of which users had accidentally made public. These often embarrassing exchanges, tied directly to public profiles displaying users' real names, drew significant media attention by June. Meta hasn't changed the feature.

In all, it goes to show that there's very little private about a technology built by scraping everyone's data in the first place. User error is technically to blame here, but security researchers have continued to find vulnerabilities that lead these motor-mouthed algorithms to accidentally reveal data they shouldn't.

More on AI: Someone Gave ChatGPT $100 and Let It Trade Stocks for a Month

