Contractors targeted homeless people with ‘dark skin’ to train Google’s facial recognition

Contractors working for Google reportedly targeted homeless people with “dark skin” to help train a facial recognition system.

As reported by our sister publication AI News, facial recognition algorithms have well-documented problems when it comes to identifying people of colour. Part of the reason for the disparity is that most data sets for training algorithms have little diversity.

Any responsible tech company will want to ensure its facial recognition technologies are equally accurate across society before they’re deployed further in areas such as law enforcement, where even some police have voiced concerns that the technology may lead to increased bias.

However, it seems that in a bid to prevent bias in its own facial recognition algorithms, Google has walked right into another controversy. The situation raises further questions about how to ethically harvest data.

The approach taken by the company reportedly contracted by Google is quite possibly the perfect example of what not to do.

Workers employed by staffing firm Randstad, but directed by Google managers, were told to specifically seek out people with “darker skin tones” and those who might be enticed by a financial reward in the form of a $5 gift card.

“They said to target homeless people because they’re the least likely to say anything to the media,” a former contractor told the New York Daily News. “The homeless people didn’t know what was going on at all.”

“I feel like they wanted us to prey on the weak,” another contractor said.

The contractors were said to have used deceptive tactics to get people to agree to the face scans, including calling it a ‘selfie game’. Devices that research subjects were given to “play with” were actually scanning their faces.

The face scans of the participants were harvested to help train Google’s facial recognition algorithms for an improved “face unlock” feature set to debut in the company’s upcoming flagship smartphone, the Pixel 4.

“This is totally unacceptable conduct from a Google contractor. It’s why the way AI is built today needs to change,” tweeted Jake Snow, an attorney with the ACLU of Northern California. “The answer to algorithmic bias is not to target the most vulnerable.”
