Noni Williams: In push to fight unconscious bias with technology, do not fear failure, fear complacency

Photo courtesy of Noni Williams

This thinkpiece by Noni Williams forms part three of SPN’s three-part series investigating unconscious bias in the hiring process (be sure to read part one and part two). Williams is a Data Analyst at Kiewit with a passion for automation and the use of data to solve complex problems. Here, she analyzes the advantages and drawbacks of relying on software to short-circuit unconscious bias among recruiters and hiring managers. The views Williams expresses are her own and do not necessarily reflect the views of SPN.

*

As we attempt to gain a foothold on the fault line where pandemic meets civil unrest, the most unifying act we can engage in is critical analysis. 

We often talk about software, especially software that boasts the use of artificial intelligence, as a counterweight to the unconscious (and often conscious) biases we carry. It’s AI as a metaphorical balancing of scales, AI often crafted by humans on homogeneous teams and trained on datasets that do not resemble the true makeup of the populations it aims to support. 

Let us consider for a moment a software solution that purports to reduce bias in the hiring process and does not learn our prejudices favoring certain groups of people over others. Enter Jobecam—a Brazil-based, “100% digital recruitment platform” that claims to be as useful for employers recruiting new talent as for job seekers desiring a neutral experience in their search. 

Jobecam claims to be faster, fairer and more efficient for candidates, especially with the company’s offer of a blind video interview. 

Here’s the idea: a job seeker records their answers to questions posed by recruiters. The recruiter then receives these interviews with audio and video altered “so that unconscious biases are eliminated.” 

This is a big claim backed with little evidence. And so ensues my barrage of rhetorical questions.

  • Are platforms like Jobecam requiring businesses to demonstrate that they have provided unbiased interview questions?
  • Have these businesses provided unbiased job descriptions?
  • Have these platforms provided evidence of their own research that demonstrates their understanding and determination of the nature of a biased question?
  • When recruiters are looking at responses to their completely unbiased questions, are they looking for responses that confirm their own cultural biases, or even biases in logic and problem solving?
  • If recruiters are looking for specific words in job seeker interviews, can they honestly be honoring the idea that cognitively diverse teams are generally better at solving problems?

To be clear, this is not an arrow pointing at Jobecam and its efforts to save businesses money in their recruitment budgets by centralizing processes and reducing the number of redundant tasks that human resources teams often take on to complete their searches for new hires. This is a call for us all to be more critical of our use of tools and platforms that claim to eliminate bias for us, giving us a false sense of moral superiority without having to commit to any of the work.

Let us assume we could achieve a completely unbiased hiring process by buying into some software running a campaign full of buzzwords that feel good. When a new hire arrives at the workplace, what environment will they be met with? Have we done the work to remove biases from our office culture? Considered what it really means to be equitable, accessible and inclusive? 

Or did we just pay for a platform that lets us placate ourselves with platitudes we can boast about in quarterly budget meetings without having to enact any real change in our workspaces?

Ultimately, the solution to “eliminating bias” is not more software. The onus shouldn’t be on some body of code to save you and your company from looking bad. 

To be sure, identifying bias is multifaceted work and should reference more than your company’s (or industry’s) specific hiring practices and general office culture. It should include relevant history, the systems existing within your respective communities, and your relationship to said history. 

The solution is to learn what it means to be good. To not fear failing, because you WILL fail frequently. To try anyway, because that is the decent thing to do. 

 

This story is part of the AIM Institute Archive on Silicon Prairie News. AIM gifted SPN to the Nebraska Journalism Trust in January 2023.