Pymetrics’ open source tool can detect bias in AI algorithms
New York-based AI startup Pymetrics has announced it has open-sourced its much-needed tool for detecting bias in algorithms.
Earlier this month, I posted an editorial titled ‘Stopping AI’s discrimination will be difficult, but vital’ on our sister site AI News. I stand by that argument, and tools like Pymetrics’ make it much easier to prevent unintentional discrimination.
Now available on GitHub, the company’s tool, called Audit AI, will scan algorithms for any group that is being favoured or disadvantaged.
Pymetrics’ background is “matching talent to opportunity, bias-free” to help ensure businesses are recruiting the best candidates for the job.
Certain groups are under-represented across society, but forcing businesses to recruit candidates who aren’t the right fit for a role is not the ideal solution. Instead, Pymetrics wants to level the playing field so everyone has the same opportunity.
Unfortunately, the current under-representation problem is causing unintentional bias. Here in the West, technologies are still mostly developed by white males and can often unintentionally perform better for this group.
A 2010 study by researchers at NIST and the University of Texas at Dallas found that facial recognition algorithms designed and tested in East Asia are better at recognising East Asians, while those designed in Western countries are more accurate at recognising Caucasians.
Audit AI can detect this kind of bias to make the developers aware, but it will be up to them to correct it.
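The article doesn’t detail Audit AI’s API, but one widely used statistical check for this kind of bias is the “four-fifths rule” from US employment guidelines: if any group’s selection rate falls below 80% of the highest group’s rate, the result is flagged as potential adverse impact. A minimal Python sketch of that check (the function name and the numbers below are illustrative, not Audit AI’s actual interface):

```python
def adverse_impact_ratios(pass_counts, total_counts):
    """Return each group's selection rate divided by the highest group's rate.

    Under the 'four-fifths rule', a ratio below 0.8 flags potential
    adverse impact against that group.
    """
    rates = {g: pass_counts[g] / total_counts[g] for g in pass_counts}
    top_rate = max(rates.values())
    return {g: rate / top_rate for g, rate in rates.items()}

# Illustrative numbers: candidates passed by a screening model, per group.
passed = {"group_a": 48, "group_b": 30}
applied = {"group_a": 100, "group_b": 100}

ratios = adverse_impact_ratios(passed, applied)
flagged = {g for g, r in ratios.items() if r < 0.8}
# group_b's selection rate (0.30) is 62.5% of group_a's (0.48),
# so it falls below the 0.8 threshold and would be flagged.
```

In practice a tool like Audit AI pairs this kind of ratio with significance tests, since small samples can produce low ratios by chance alone.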
“From policing, to welfare systems, online discourse, and healthcare – to name a few examples – systems employing machine learning technologies can vastly and rapidly change or reinforce power structures or inequalities on an unprecedented scale and with significant harm to human rights,” digital rights group Access Now wrote in a recent post.
Of course, some are making their algorithms discriminate on purpose.
What are your thoughts on Pymetrics’ bias-detecting tool? Let us know in the comments.