Pymetrics’ open source tool can detect bias in AI algorithms
New York-based AI startup Pymetrics has announced it has open-sourced its tool for detecting bias in algorithms.
Earlier this month, I posted an editorial titled ‘Stopping AI’s discrimination will be difficult, but vital’ on our sister site AI News. I stand by that argument, and Pymetrics is now making it much easier to prevent unintentional discrimination.
Now available on GitHub, the company’s tool, called Audit AI, will scan algorithms and flag demographic groups which are being favoured or disadvantaged.
Pymetrics’ background is “matching talent to opportunity, bias-free” to help ensure businesses are recruiting the best candidates for their job.
There’s an under-representation of certain groups across society, but forcing businesses to recruit from those groups even when a candidate isn’t the right fit for the job is not the ideal solution. Instead, Pymetrics wants to level the playing field so everyone has the same opportunity.
Unfortunately, the current under-representation problem is causing unintentional bias. Here in the West, technologies are still mostly developed by white males and can often unintentionally perform better for this group.
A 2010 study by researchers at NIST and the University of Texas in Dallas found that algorithms designed and tested in East Asia are better at recognising East Asians, while those designed in Western countries are more accurate at detecting Caucasians.
Audit AI can detect this kind of bias to make the developers aware, but it will be up to them to correct it.
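To see how this kind of detection can work, here is a minimal sketch of one widely used fairness check, the “four-fifths rule”: a model’s selection rate for any group should be at least 80% of the rate for the most-favoured group. This is an illustration of the general technique, not Audit AI’s actual API; the data and function names are hypothetical.

```python
# Illustrative sketch of a four-fifths-rule bias check.
# Not Audit AI's actual API -- names and data here are hypothetical.

def selection_rates(outcomes_by_group):
    """Fraction of positive outcomes (e.g. 'recommend hire') per group."""
    return {
        group: sum(outcomes) / len(outcomes)
        for group, outcomes in outcomes_by_group.items()
    }

def four_fifths_check(outcomes_by_group, threshold=0.8):
    """Return groups whose selection rate falls below `threshold`
    relative to the most-favoured group's rate."""
    rates = selection_rates(outcomes_by_group)
    top = max(rates.values())
    return {
        group: rate
        for group, rate in rates.items()
        if rate / top < threshold
    }

# Hypothetical model outputs: 1 = recommended, 0 = rejected.
outcomes = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1, 1, 1],  # 8/10 selected
    "group_b": [1, 0, 0, 1, 0, 0, 1, 0, 0, 0],  # 3/10 selected
}

flagged = four_fifths_check(outcomes)
print(flagged)  # group_b's rate (0.3) is only 37.5% of group_a's (0.8)
```

As in Audit AI, a check like this only surfaces the disparity; deciding whether it reflects genuine bias, and how to fix the model, remains a human judgement.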
“From policing, to welfare systems, online discourse, and healthcare – to name a few examples – systems employing machine learning technologies can vastly and rapidly change or reinforce power structures or inequalities on an unprecedented scale and with significant harm to human rights,” wrote digital rights campaigners Access Now recently in a post.
Of course, some are making their algorithms discriminate on purpose.
What are your thoughts on Pymetrics’ bias-detecting tool? Let us know in the comments.