Artificial intelligence has torn through many industries since the debut of ChatGPT in 2022, but there’s probably no single area where it’s had a clearer material impact than software development.
Programmers running the gamut from experienced to novice have embraced the tech, using chatbots and specialty tools to quickly generate code from natural language prompts. “Vibe coding,” as it’s come to be known, lets almost anyone churn out entire apps in little time — even if they have little or no technical chops.
On a certain level, you have to admit that’s pretty cool. But as we’re learning time and again, it also has distinct downsides.
One particularly glaring drawback is that a lot of vibe-coded software is now being deployed with gaping security flaws. In the latest sign that we may be veering into an AI-enabled hack-pocalypse, a fascinating new Wired story covers research by a cybersecurity firm called RedAccess that found sprawling privacy issues in vibe-coded apps.
The firm examined thousands of web apps created with the vibe coding platforms Lovable, Replit, Base44, and Netlify. What it found was, to put it lightly, not good: 5,000 of them had “virtually no security or authentication of any kind,” and a full 40 percent exposed users’ sensitive data, from medical and financial info to corporate documents and logs of ostensibly private chatbot conversations.
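To make the flaw class concrete: a sketch of what "no authentication of any kind" looks like in practice, with a hypothetical record-lookup handler (all names here are illustrative, not drawn from RedAccess's report or any specific platform). The insecure version hands back any user's data to whoever asks; the fixed version checks that the caller's session actually belongs to the user being queried.

```python
# Hypothetical illustration of the flaw class described above: an app
# endpoint that returns stored records without checking who is asking.
# RECORDS, handle_request_insecure, and handle_request_secure are
# invented names for this sketch.

RECORDS = {
    "alice": {"diagnosis": "private medical info"},
    "bob": {"account": "private financial info"},
}

def handle_request_insecure(user_id):
    # No authentication: anyone who guesses or enumerates a user_id
    # gets that user's data back.
    return RECORDS.get(user_id)

def handle_request_secure(user_id, session_token, sessions):
    # Minimal fix: only return data when the caller's session token
    # maps to the same user they are asking about.
    if sessions.get(session_token) != user_id:
        return None  # reject unauthenticated or cross-user access
    return RECORDS.get(user_id)
```

The difference is a single conditional, which is precisely why these bugs are so easy for a generated codebase to omit and for a non-technical creator to miss.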
“The end result is that organizations are actually leaking private data through vibe-coding applications,” RedAccess cofounder Dor Zvi told Wired. “This is one of the biggest events ever where people are exposing corporate or other sensitive information to anyone in the world.”
The vibe coding platforms’ responses to the embarrassing revelations left something to be desired. Netlify ignored the findings entirely, while the other platforms largely deflected blame onto users, saying they should have better secured their work before releasing it into the world.
“We’re treating this as an ongoing matter,” a Lovable spokesperson told Wired. “It’s also worth noting that Lovable gives builders the tools to build securely, but how an app is configured is ultimately the creator’s responsibility.”
They’re right, up to a point, but these are also the companies claiming that creating software is now as simple as describing it to an AI bot. The reality is that AI remains deeply imperfect, so the resulting code is going to have issues that only an experienced human developer or security expert would be able to identify — and these apps, fundamentally, are in the business of putting those people out of work.
“Anyone from your company at any moment can generate an app, and this is not going through any development cycle or any security check,” Zvi told Wired. “People can just start using it in production without asking anyone. And they do.”
More on vibe coding: Entirely Vibe-Coded Operating System Is a Bug-Filled Disaster