Meta is demanding access to all of your photos, even the ones you haven't uploaded anywhere yet — and it's being incredibly shifty about what it intends to do with them.

As The Verge reports, the Mark Zuckerberg-led company refuses to rule out the possibility that it will use your phone's camera roll to train its AI models, offering only the assurance that it's not "currently" doing so. If that changes in the future, it would be a striking testament to the AI industry's desperation for clean, AI-pollution-free sources of training data.

For context, last week Facebook began showing users a prompt asking them to opt into "cloud processing," TechCrunch reported. Should you consent, Facebook can grab content from your camera roll and upload it to its servers "on a regular basis" so it can generate recaps and "AI restylings" of your photos.

The important detail is that opting in means agreeing to Meta's AI terms, which state that "once shared, you agree that Meta will analyze those images, including facial features, using AI." Meta would also gain the right to "retain and use" the information shared with its AI systems.

Your alarm bells should already be ringing. Any data that gets fed into an AI system runs the risk of being coughed up or reproduced in some shape or form. And asking for access to your entire camera roll so Meta's tech can "analyze" your photos is a huge and invasive escalation — it's shameless that Meta's even asking. Apparently, already using everyone's billions of Facebook and Instagram posts made since 2007 wasn't enough for Zuckerberg's tech juggernaut.

Moreover, Meta's AI terms don't make clear whether the unpublished camera roll photos it pulls in for "cloud processing" are safe from AI training. That's in stark contrast with the terms for apps like Google Photos, The Verge noted, which explicitly state that your personal data won't be used as training material.

What feels most telling of all is that, when pressed for comment, Meta wouldn't say how it plans to use this data in the future.

 "[The Verge's headline] implies we are currently training our AI models with these photos, which we aren't. This test doesn't use people's photos to improve or train our AI models," Meta public affairs manager Ryan Daniels told the Verge

"These suggestions are opt-in only and only shown to you — unless you decide to share them — and can be turned off at any time," said Meta comms manager Maria Cubeta in a statement, via the Verge. "Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test."

The spokespeople clarified that opting in gives Meta permission to retrieve 30 days' worth of unpublished camera roll content at a time, rather than all at once, though the terms also state that some AI "suggestions" tied to themes like weddings and graduations may draw on photos older than that.

Tech companies have already scraped virtually the entire surface internet for data. That's bad enough, but the saving grace was that this was, at least ostensibly, content that people consciously chose to make public. Now Meta is taking a big step toward dissolving that thin barrier, nudging people to hand over unpublished content while letting them believe it's still safe in their camera rolls. It speaks to just how badly the tech industry is aching for fresh brain food for its AI models, which are at risk of collapsing on their current diet of increasingly AI-generated gruel as pure sources of human-made content become harder to access.

While training may be off the table for now, Meta is certainly letting its AI process the content in your camera roll and your social media profiles. In its coverage, TechCrunch spotted some Facebook users who complained that their old photos had automatically been turned into anime-style images using Meta AI.

More on AI training: Anthropic Shredded Millions of Physical Books to Train its AI

