Slippery Slope

Apple Will Scan Every iPhone for Images of Child Sexual Abuse

"This will break the dam — governments will demand it from everyone."


Apple has announced that it will scan every photo uploaded to iCloud Photos in the US for images of child sexual abuse.

The tech giant, however, isn’t algorithmically scanning each image for problematic content — instead, it will be comparing each photo against a known database of what experts call child sexual abuse material (CSAM).

If it finds any CSAM, it will report the user to law enforcement.

On the surface, the initiative appears to be a morally sound effort to root out sexual abuse and identify perpetrators. But privacy advocates worry that the invasive technology could eventually be expanded to scan phones for other types of content as well — think pirated files, or signs of political dissent in totalitarian countries.


It’s a particularly troubling development because Apple has a huge interest in continuing to sell its products in China, a country where the government has put tremendous amounts of pressure on tech companies to give it access to user data.

In fact, Apple has already given in to China’s demands in the past, agreeing to host Chinese users’ data in data centers inside the country, as Wired points out.

The system will be included in upcoming versions of iOS, macOS, and iPadOS. Images uploaded to iCloud Photos, Apple’s cloud-based photo storing service, will be compared to a list of images compiled by the US National Center for Missing and Exploited Children and other child safety organizations.

“Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices,” the company’s website reads.


In other words, photos are converted into hashes on the device and then compared against the hashed database of known CSAM. “Apple does not learn anything about images that do not match the known CSAM database,” the company clarified in a PDF accompanying the announcement.

Even slightly edited versions of these known images will trigger an alert, according to Apple.
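In rough terms, that means the matching tolerates small differences between hashes rather than requiring exact equality. Below is a minimal, purely illustrative sketch of that idea, assuming a perceptual-hash scheme compared by Hamming distance; it is not Apple’s NeuralHash, and every hash value, threshold, and name here is invented for illustration.

```swift
import Foundation

// Toy sketch of hash-based matching against a database of known hashes.
// NOT Apple's actual system; all values below are placeholders.

/// Hamming distance between two 64-bit hashes: the number of differing bits.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Hypothetical database of hashes of known prohibited images.
let knownHashes: Set<UInt64> = [0xDEAD_BEEF_1234_5678, 0x0123_4567_89AB_CDEF]

// Hypothetical hash of a photo about to be uploaded
// (deliberately one bit away from a known hash, mimicking a slight edit).
let uploadHash: UInt64 = 0xDEAD_BEEF_1234_5679

// A perceptual hash changes only slightly when the image is slightly edited,
// so matching uses a small distance threshold instead of exact equality.
let threshold = 4
let isMatch = knownHashes.contains { hammingDistance($0, uploadHash) <= threshold }

print(isMatch ? "Flag for human review" : "No match; nothing is learned about the photo")
```

The key design point the sketch illustrates is that a near-duplicate of a known image still lands within the threshold and triggers a flag, while unrelated photos, whose hashes differ in many bits, do not.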

Apple already uses a similar hashing system to scan for CSAM sent via email, as does Google’s Gmail, The Verge points out.

So what about false positives? Being falsely flagged for child sexual abuse material would be a serious problem. Apple claims there is “less than a one in one trillion chance per year of incorrectly flagging a given account.”


Apple employees will also manually review any flagged accounts before deciding whether to report them to law enforcement.

Despite the safeguards, though, security researchers argue it’s a slippery slope.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, told the BBC.

“Whether they turn out to be right or wrong on that point hardly matters,” he added. “This will break the dam — governments will demand it from everyone.”


Nadim Kobeissi, a cryptographer, told Wired that the initiative is a “very, very slippery slope” and that he “definitely will be switching to an Android phone if this continues.”

READ MORE: Apple to scan iPhones for child sex abuse images [BBC]

More on sexual abuse: People Caught Using AI to Role-Play Sexual Abuse of Children

