"Clearview is a total affront to peoples' rights, full stop, and police should not be able to use this tool."
Clearview AI, the company behind a widely used facial recognition technology that has already led American police to charge innocent people with crimes they didn't commit, claims to have scraped 30 billion Facebook photos to train its AI algorithm, according to comments that CEO Hoan Ton-That made to the BBC last week.
And if that weren't enough, Ton-That said in the same interview that US law enforcement agencies, 3,100 of which have used the database according to Engadget, have run nearly one million searches against that algorithm.
To put that 30 billion figure in perspective: as of the last quarter of last year, Facebook had roughly 2.94 billion monthly active users. So, even accounting for bots, it's safe to say that there are a lot of faces in that database, scraped without the explicit knowledge of Facebook users — and, as Facebook tells it, without the social media giant's permission, either. In fact, Facebook has already sent Clearview at least one cease-and-desist.
"Clearview AI's actions invade people's privacy," a Meta spokesperson told Insider in an email, "which is why we banned their founder from our services and sent them a legal demand to stop accessing any data, photos, or videos from our services."
For his part, Ton-That says that everything Clearview does is perfectly fine. Helpful, even!
"Clearview AI's database of publicly available images is lawfully collected, just like any other search engine like Google," the CEO told Insider. "Clearview AI's database is used for after-the-crime investigations by law enforcement, and is not available to the general public."
"Every photo in the dataset," he continued, "is a potential clue that could save a life, provide justice to an innocent victim, prevent a wrongful identification, or exonerate an innocent person."
But as is to be expected, some experts see this particular cornerstone of the ever-growing — and largely unregulated — police surveillance state in a very different light.
"Clearview is a total affront to peoples' rights, full stop, and police should not be able to use this tool," Caitlin Seeley George, the director of campaigns and operations for the nonprofit digital rights advocacy group Fight for the Future, told Insider.
"Without laws stopping them," George added, "police often use Clearview without their department's knowledge or consent, so Clearview boasting about how many searches [it has run] is the only form of 'transparency' we get into just how widespread use of facial recognition is."
READ MORE: Clearview AI used nearly 1m times by US police, it tells the BBC [BBC]
More on facial recognition: Man Accuses Cops of Throwing Him in Jail Based on False Facial Recognition Match