Protect your art, sabotage the AI models that copy it.

Poison Pill

Being an artist in today's world means you can't just worry about plain ol' plagiarism and duplicitous copycats: you now have to be mindful of generative AI models cribbing your work, too.

Thankfully, a new tool called Nightshade purportedly not only protects your images from being mimicked by AI models, but also "poisons" the models that train on them by feeding them misleading data.

Its developers announced on Friday that a finished version of Nightshade, first teased in late 2023, is finally available for download.

It's the latest sign of artists hardening their stance against AI image generators like Stable Diffusion and Midjourney, which were trained on their works without permission or compensation.

Artist Counterattack

Nightshade's developers, a team of computer scientists at the University of Chicago, say their software is meant to be "an offensive tool," whereas its predecessor Glaze was designed to be a defensive one.

They still recommend using both tools. Glaze works by subtly modifying — "glazing" — an image at the pixel level. These changes are largely imperceptible to the naked eye, "like UV light" in the developers' words, but are clearly visible to AI models, which see imagery differently. The overall effect obfuscates an image's content to an AI.
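The developers haven't published a drop-in snippet here, but the general family of techniques is well studied: constrain a tiny per-pixel perturbation and optimize it so a model's internal features drift away from the original image. The sketch below illustrates that idea only; it is not Glaze's actual algorithm, and the ResNet stand-in extractor, the epsilon budget, and the `cloak` function are all assumptions for demonstration.

```python
# Rough sketch of pixel-level "cloaking": nudge an image within a tiny
# per-pixel budget so a feature extractor's embedding drifts away from
# the original. NOT Glaze's actual method; model, loss, and budget are
# illustrative assumptions.
import torch
import torchvision.models as models

def cloak(image, feature_extractor, epsilon=4 / 255, steps=50, lr=1e-2):
    """Perturb `image` (a [1, 3, H, W] tensor in [0, 1]) so its features
    move away from the original while the change stays imperceptible."""
    with torch.no_grad():
        original_features = feature_extractor(image)

    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        perturbed = (image + delta).clamp(0, 1)
        features = feature_extractor(perturbed)
        # Maximize distance from the original embedding
        # (so minimize its negative).
        loss = -torch.nn.functional.mse_loss(features, original_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Keep each pixel within +/- epsilon of the original, so the
        # edit stays invisible to humans: "like UV light".
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)

    return (image + delta).detach().clamp(0, 1)

# Stand-in feature extractor (an assumption; Glaze targets the feature
# space of text-to-image models, not a generic ResNet classifier).
extractor = models.resnet18(weights=None)
extractor.fc = torch.nn.Identity()  # keep penultimate features
extractor.eval()
for p in extractor.parameters():
    p.requires_grad_(False)

image = torch.rand(1, 3, 224, 224)  # placeholder for a real artwork
cloaked = cloak(image, extractor)
```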

Nightshade takes this a step further. In "shading" an image, the tool also introduces subtle changes, but these alterations can cause an AI model to incorrectly identify what it's seeing. A human might see a "cow in a green field," the developers wrote, "but an AI model might see a leather purse lying in the grass."
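Conceptually, that resembles a targeted version of the same optimization: instead of merely pushing an image's features away from the original, the perturbation pulls them toward a decoy concept. The sketch below is again a hedged illustration, not Nightshade's published method; the `shade` function, the decoy-image setup, and all parameters are assumptions.

```python
# Targeted variant: pull the image's features toward a decoy concept.
# A hedged sketch, NOT Nightshade's published optimization; the model,
# loss, and budget are illustrative assumptions.
import torch
import torchvision.models as models

def shade(image, decoy_image, feature_extractor, epsilon=4 / 255,
          steps=100, lr=1e-2):
    """Perturb `image` ([1, 3, H, W] in [0, 1]) so its features approach
    those of `decoy_image` while each pixel stays within +/- epsilon."""
    with torch.no_grad():
        decoy_features = feature_extractor(decoy_image)

    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        perturbed = (image + delta).clamp(0, 1)
        features = feature_extractor(perturbed)
        # Minimize distance to the decoy embedding: a human still sees
        # a cow, but the extractor leans toward "leather purse".
        loss = torch.nn.functional.mse_loss(features, decoy_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)

    return (image + delta).detach().clamp(0, 1)

# Stand-in extractor and placeholder images (assumptions for the demo).
extractor = models.resnet18(weights=None)
extractor.fc = torch.nn.Identity()
extractor.eval()
for p in extractor.parameters():
    p.requires_grad_(False)

cow = torch.rand(1, 3, 224, 224)    # the artwork a human sees
purse = torch.rand(1, 3, 224, 224)  # the decoy concept image
shaded = shade(cow, purse, extractor)
```

The poisoning effect comes from scale: a model trained on enough of these mislabeled-in-feature-space images starts associating the wrong concept with the wrong visuals.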

Bargaining Power

The developers envision that, with wide enough adoption, Nightshade can "disrupt models that scrape their images without consent (thus protecting all artists against these models)." That could "increase the cost of training on unlicensed data," pressuring AI companies to train only on work properly licensed from creators.

Meanwhile, Glaze can throw off prompts that mimic a specific artist's style. Indeed, some artists have been so ubiquitously copied in AI-generated images that their names had to be banned from prompts, though this has done little to put off tenacious AI bros hellbent on copying their work.

It's perhaps the best way for artists to protect their work because, as of now, the paths to legal recourse over copyright infringement remain uncertain. Artists, authors, and even Getty Images have sued AI companies on this front, and the legality of the practice won't become clearer until those cases are decided. Until then, AI companies will gobble up as much content as they can. So for now, "shade first, glaze last."

More on generative AI: AI Garbage Is Destroying Google Results

