AI Bias Audit

The Responsible Use of AI Begins with Understanding the Potential for Unconscious Bias within Your Automated Employment Decision Tools (AEDTs)

Kanarys AI Bias Audit

Our approach provides an independent, objective audit that assesses whether the use of AEDTs results in adverse or disparate impact on the basis of race, gender, ethnicity, age, and intersectional identities. The results are delivered in a report that is ready for public disclosure, along with recommendations tailored to your organization's specific needs.

AI is increasingly being used in HR processes, but it is important to be aware of its potential for bias. AI can be a powerful workplace tool, and employers increasingly adopt it to save time, enhance productivity, and increase objectivity. But it can also introduce bias into employment decisions, including recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral.

As more employers use automated, algorithmic, and artificial-intelligence screening and decision-making tools in hiring and promotion, organizations must proactively understand and assess the potential for AI bias and the steps needed to mitigate it.

Helping You Mitigate AI Bias Across the Entire Employment Decision-making Process

Avoiding bias in employment decisions helps protect your reputation and minimize legal liability. An independent, impartial third-party assessment validates and furthers your organization's commitment to minimizing bias in employment decision-making and to building a representative organization.

Kanarys will conduct an independent bias audit to examine whether your system has unintended bias against any protected categories, such as gender, sex, ethnicity, or race.

Kanarys will collate the results of the AI bias audit into a comprehensive summary of findings, including selection rates and impact ratios for protected categories across hiring, performance management, promotions, and other areas.
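To illustrate the kind of metric such a summary covers: under NYC Local Law 144, an impact ratio for a selection decision is a category's selection rate divided by the selection rate of the most-selected category. The sketch below is a minimal illustration of that calculation, with invented group labels and counts; it is not Kanarys' actual methodology.

```python
def impact_ratios(selected, applicants):
    """Each group's selection rate divided by the highest group's rate.

    selected / applicants: dicts mapping a group label to its counts.
    """
    rates = {g: selected[g] / applicants[g] for g in selected}
    top_rate = max(rates.values())
    return {g: rate / top_rate for g, rate in rates.items()}

# Hypothetical counts: 40 of 100 group-A applicants selected, 24 of 100 group-B.
ratios = impact_ratios({"A": 40, "B": 24}, {"A": 100, "B": 100})
# Group B's ratio is about 0.6, below the EEOC's four-fifths (0.8) guideline,
# which would flag the tool for closer review.
```

The four-fifths threshold comes from the EEOC's Uniform Guidelines on Employee Selection Procedures; it is a rule of thumb, not a statistical test on its own.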

Our team of data scientists, I/O psychologists, and software engineers will provide you with ongoing training and insights to prevent, detect, and mitigate AI bias issues as your system evolves, ensuring that your AEDTs remain unbiased.

How Kanarys' AI Bias Audit Works

Confirm

From resume scanners and video interviewing software to chatbots, performance testing, and “cultural fit” applications, employers rely on different types of software that incorporate algorithmic decision-making at many stages of the employment process. Our team will work with you to confirm and identify the AEDTs your organization uses to manage employment decisions, and will then collaborate with you to understand your internal processes.

Set

We set statistical significance thresholds for impacted groups, including race, gender, ethnicity, age, and intersectional groups, so that your organization’s policies, procedures, processes, and practices can be evaluated for potential adverse impact on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), or national origin.
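One common way to apply such a threshold, assuming the metric being compared is a selection rate, is a two-proportion z-test between two groups. The sketch below uses hypothetical counts and is offered as an illustration of the general technique, not as Kanarys' specific procedure.

```python
from math import sqrt

def two_proportion_z(selected_a, total_a, selected_b, total_b):
    """Z statistic for the difference between two groups' selection rates."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    # Pooled rate under the null hypothesis that both groups share one rate.
    pooled = (selected_a + selected_b) / (total_a + total_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (rate_a - rate_b) / std_err

# Hypothetical: 50 of 100 applicants selected in one group vs. 30 of 100 in another.
z = two_proportion_z(50, 100, 30, 100)
# |z| > 1.96 corresponds to significance at the conventional 5% (two-sided) level.
```

In practice the threshold (and whether a two-sided or one-sided test is appropriate) is agreed with the organization before the audit begins.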

Assess

Our team will partner with your organization to conduct an AI bias audit of algorithmic decision-making tools across candidate data, talent pipeline, performance management, promotions, and other areas. We will use benchmark data to identify any potential AI bias.

Compare

We analyze the data and compare your organization against national and regional benchmarks across a number of key areas, including candidate pipeline, performance indicators, and employee retention.

Recommend

We provide a detailed report on the systems audited, the adverse impact analysis method and results, and customized recommendations for mitigation, as required for public disclosure.

The Kanarys Team

Kanarys has a proven track record of helping organizations assess organizational practices, from hiring and interviewing practices to AI bias in HR technology. We have collaborated with Fortune 500 companies, chambers of commerce, trade associations, and other groups. Our interdisciplinary team of data scientists, I/O psychologists, and software engineers works closely with organizations to assess, analyze, and implement solutions to ensure that they are not introducing unintended bias through HR technology.

Additional Resources

Are you ready for the NYC Bias Audit Law? Local Law 144 (the NYC Bias Audit Law) was enacted by the New York City Council in November 2021. As of July 2023, companies are prohibited from using automated tools to hire candidates or promote employees unless the tools have been independently audited for bias. EEOC guidance goes even further, covering recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral. These regulations are likely to affect hundreds of organizations within the city, and yours may be among them.


Employers are required to make publicly available on their website a summary of the most recent bias audit results for the AEDT, along with the distribution date of the tool. This disclosure must be made prior to use of the AEDT, providing transparency and informing individuals about the potential biases associated with the tool.
