Predicting a Less Diverse Workforce: How Predictive Analytics Introduce Bias in the Hiring Process

By: Leigh Harvis

Predictive analytics have been used to forecast what someone will buy, whom they will date, which TV show they should watch next, and whom a company should hire. These tools work by analyzing historical data to predict future trends and behavior.[1]

The predictive analytics market is projected to reach $10.95 billion by 2022, a 21% increase over the previous five years.[2][3] Predictive analytics solutions are used throughout the hiring process, from initial recruitment to the offer stage.[4] While these tools may seem like an efficient and effective way to hire, there are concerns that the algorithms used in predictive hiring are introducing bias.[5]

For example, Amazon developed a recruitment tool that penalized resumes that included the word “women’s.”[6] The problem? The team developing the tool trained its computer models to favor resumes containing words like “executed” or “captured,”[7] words found primarily in the resumes of male engineers.[8]
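To illustrate how this kind of bias can arise, consider a toy sketch (all data and names below are invented for illustration and are not Amazon's actual model or training set): a scorer that learns per-word weights from a skewed history of past hiring decisions will assign a negative weight to a word like “women’s” even though gender is never an explicit input.

```python
from collections import Counter

# Hypothetical historical data (invented for illustration): past hires
# drawn from a male-dominated engineering workforce.
past_hires = [
    "executed backend migration captured market share",
    "executed deployment pipeline captured performance gains",
    "led team executed rollout",
]
past_rejections = [
    "captain of women's chess club organized outreach",
    "women's coding society mentor organized workshops",
]

# Learn a crude per-word score: how much more often a word appears in
# hired resumes than in rejected ones. Words like "women's" end up with
# a negative score purely because of the skewed history.
hired_counts = Counter(w for r in past_hires for w in r.split())
rejected_counts = Counter(w for r in past_rejections for w in r.split())
vocab = set(hired_counts) | set(rejected_counts)
word_score = {w: hired_counts[w] - rejected_counts[w] for w in vocab}

def score(resume: str) -> int:
    """Sum the learned word scores over the resume's words."""
    return sum(word_score.get(w, 0) for w in resume.split())

# Two candidates with identical experience; the one who also mentions a
# "women's" group is penalized by the learned weights.
a = score("executed data pipeline captured efficiency gains")
b = score("executed data pipeline captured efficiency gains women's society lead")
print(a, b)  # the second score is strictly lower
```

The point of the sketch is that no one programmed a rule against women: the penalty emerges automatically from training on a historically biased sample, which is the same dynamic the Amazon reporting describes.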

Other companies, like HireVue, use artificial intelligence to predict job success based on how candidates respond to pre-selected interview questions.[9] The company’s on-staff psychologists have formulated customized algorithms that assess traits and facial movements which they believe correlate with success in certain roles.[10] In 2019, the Electronic Privacy Information Center filed a complaint with the Federal Trade Commission, claiming that HireVue’s use of facial recognition software to assess candidates was biased.[11] They argued that facial recognition is biased against candidates with psychological conditions and candidates from different racial backgrounds.[12]

In 2019, the Algorithmic Accountability Act was introduced as a way to foster accountability for companies that use “automated decision systems.”[13] The Act would “. . . require all corporations that use ‘automated decision systems’ to submit impact assessments of the accuracy, fairness, bias, discrimination, privacy, and security of their automated decision-making systems to the Federal Trade Commission (‘FTC’).”[14] However, the bill failed, never making it past the first committee in the House.[15] There is a plan to reintroduce the bill within the coming months.[16] The hope is that the Biden administration will prioritize technological innovation, leading to a different outcome for the bill.[17]

The regulation of artificial intelligence tools for hiring remains an ongoing conversation, which creates gaps in how organizations hold themselves accountable for hiring a diverse workforce. It is therefore critical that both developers and hiring managers consider the risk of adverse impact when using predictive analytics solutions to hire.

[1] John Edwards, What is Predictive Analytics? Transforming Data into Future Insights, CIO (Aug. 15, 2019, 3:00 AM),

[2] Press Release, Zion Mkt. Rsch., Trends in Predictive Analytics Market Size & Share will Reach $10.95 Billion by 2022 (Mar. 2, 2018),

[3] Id.

[4] Miranda Bogen, All the Ways Hiring Algorithms Can Introduce Bias, Harv. Bus. Rev. (May 6, 2019),

[5] Id.

[6] Jeffrey Dastin, Amazon Scraps Secret AI Recruiting Tool that Showed Bias Against Women, Reuters (Oct. 10, 2018, 7:04 PM),

[7] Id.

[8] Id.

[9] Rebecca Heilweil, Artificial Intelligence will Help Determine if You Get Your Next Job, Vox (Dec. 12, 2019, 8:00 AM),  

[10] Id.

[11] Complaint & Request for Investigation, Injunction, & Other Relief Submitted by the Electronic Privacy Information Center (EPIC) at 12, In re HireVue, Inc. (F.T.C. Nov. 6, 2019),

[12] Id. at 7–8.

[13] Yifat Nahmias & Maayan Perel, The Oversight of Content Moderation by AI: Impact Assessments and Their Limitations, 58 Harv. J. on Legis. 145, 149 (2021).

[14] Id.

[15] Grace Dille, Sen. Wyden to Reintroduce AI Bias Bill in Coming Months, MeriTalk (Feb. 19, 2021, 9:00 AM),

[16] Id.

[17] Id.