To ensure Google's latest face unlock feature would work on all types of faces, the company needed to train the system using — you guessed it — all types of faces.

That's Machine Learning 101, right?

But the way the megacorporation's contractors went about collecting that data was downright devious, according to a New York Daily News investigation, which found that the contractors targeted — and lied to — homeless people to get scans of their faces.

To collect the face scans, Google hired the staffing firm Randstad. Randstad, in turn, hired contractors who hit the streets in various U.S. cities armed with face-scanning phones and $5 gift cards for anyone willing to submit to a scan.

Randstad supervisors instructed the contractors to "specifically" target homeless black people, one source told the Daily News, adding that the homeless people "didn’t know what was going on at all."

Even worse: Randstad supervisors instructed contractors to go out of their way to conceal the purpose of the face scans.

The Daily News' sources said they were told to frame the face scans as a "selfie game" or "mini-game." Supervisors also instructed them to rush their targets through the necessary consent agreements and to walk away from anyone who seemed "suspicious" about the collection.

And if the person directly asked if the phone was taking a video? According to one source, contractors were instructed to just say, "Oh it’s not really."

The underlying intention behind this whole debacle — Google wanting to ensure its face-scanning tech wouldn't be biased — is actually laudable in a field of artificial intelligence currently rife with racist, sexist technology.

But if the claims of the Daily News' sources are true, the tech giant needs to do a far better job of vetting the companies it hires to help it achieve that goal.

READ MORE: Google using dubious tactics to target people with ‘darker skin’ in facial recognition project: sources [New York Daily News]

More on biased AI: Microsoft Announces Tool to Catch Biased AI Because We Keep Making Biased AI

