Can big data analytics really reduce recruitment bias?

Is it worse to intentionally discriminate against job candidates based on traits like race or gender, or to believe you aren’t discriminating when in fact you are?

This great ethical quandary is brought to you by the big data movement in recruitment, the supposed answer to discriminatory hiring practices – or so says Silicon Valley.

Big data’s bias-blind promise

Here’s the hope: impartial, carefully trained algorithms parse large datasets and grade candidates, determining whether job-seekers merit a hiring manager’s actual attention.

In theory, computerized objectivity shields candidates from discrimination and frees hiring managers from the time sink of manual resume screening. One study from ResumeterPro, a provider of applicant tracking systems, found that hiring managers never see 7 out of 10 of the applications fed into these systems.

But therein lies the problem: Big data algorithms are actually no more equipped to prevent recruitment discrimination than a flesh-and-bone hiring manager. Why?

1. ‘UNBIASED’ ALGORITHMS DEPEND ON BIASED PROGRAMMING

Recruitment AI learns from the data that fuels it – e.g., the job applications and resumes that yielded positive or negative outcomes, as judged by the employer. So what happens if this software quietly picks up a hiring manager’s habit of dismissing applicants with stereotypically African-American or Latino names?

A recent study found that such prejudices remain rampant in today’s labor market: presumed white candidates receive higher callback rates than presumed minority candidates. Those discriminatory cues, however unconscious, will train recruitment AI to mimic the same flawed decision-making and apply it over and over.
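To see how directly biased decisions become biased software, here is a minimal sketch using entirely hypothetical data and group labels. A naive screening model trained on a human screener’s past callback decisions simply learns – and then replays – whatever pattern those decisions contained:

```python
# Toy illustration (hypothetical data): a naive screening model trained on
# past hiring decisions inherits whatever bias those decisions contained.
from collections import defaultdict

# Historical outcomes: (name_group, years_experience, got_callback).
# The callback labels reflect a biased human screener, not merit.
history = [
    ("group_a", 5, True), ("group_a", 3, True), ("group_a", 2, True),
    ("group_b", 5, False), ("group_b", 3, False), ("group_b", 6, True),
]

def train(records):
    """Learn the observed callback rate for each name group."""
    totals, hits = defaultdict(int), defaultdict(int)
    for group, _, callback in records:
        totals[group] += 1
        hits[group] += callback
    return {g: hits[g] / totals[g] for g in totals}

model = train(history)
print(model["group_a"])  # 1.0  -> always advanced
print(model["group_b"])  # ~0.33 -> usually filtered out
```

The model never sees race – only a proxy feature – yet it faithfully reproduces the screener’s disparity. Real recruitment AI is far more sophisticated, but the feedback loop is the same.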

2. AI-POWERED RECRUITMENT LACKS AUDITING AND REGULATION

So the question becomes: How do we keep imperfect programmers from baking their biases into otherwise equitable AI? But that’s just it – we can’t.

Right now, algorithmic auditing for recruitment analytics is something of a Wild West. There is no sheriff, and no established standard. A business would need a cadre of computer scientists at its disposal to examine inputs and run tests – and, call us crazy, that seems outside the financial scope of many companies.
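That said, one simple check a company could run without a data-science team is the EEOC’s “four-fifths rule” of thumb: a selection rate for any group below 80% of the highest group’s rate is taken as evidence of adverse impact. A minimal sketch, using hypothetical screening numbers:

```python
# Four-fifths rule audit sketch (hypothetical data): flag any group whose
# selection rate falls below 80% of the best-performing group's rate.

def selection_rates(outcomes):
    """outcomes: {group: (selected, total)} -> {group: selection rate}"""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def four_fifths_violations(outcomes, threshold=0.8):
    """Return the groups (and their rates) that fail the four-fifths test."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: r for g, r in rates.items() if r < threshold * best}

# Hypothetical output of an AI screening pass: (advanced, applied) per group.
screened = {"group_a": (40, 100), "group_b": (18, 100)}
print(four_fifths_violations(screened))  # {'group_b': 0.18}
```

A test like this only catches disparities after the fact, of course – it says nothing about *why* the algorithm filtered whom it filtered, which is exactly the auditing gap described above.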

And even if there were a way to perfectly monitor recruitment AI, we again run into flawed logic. Instead of a system that coolly and accurately measures a candidate’s many facets while ignoring the ones that don’t matter, what hiring managers really want is one supermetric for candidate fit – a single score that dictates whom to hire and whom to avoid. Round peg, meet square hole.

3. NEITHER METHOD AVOIDS LIABILITY

True or false: If an applicant reports your business to the Equal Employment Opportunity Commission and investigators from your local Fair Employment Practices Agency find biases in your analytics-based recruitment software, the fault lies with the third-party software designer, not the employer.

That’s false.

As the legal saying goes, ignorance of the law excuses no one. According to the EEOC, employers are always responsible for preventing discriminatory practices at their businesses and are thus culpable even when their vendors’ recruitment algorithms award or turn away candidates unjustly.

Another relevant saying: It’s a poor craftsman who blames the tools. As complex as machine learning might appear to the layperson, that intricacy does not absolve users of blame when something goes wrong – no more than if a hiring manager somehow trained her email inbox to filter out applications from anyone over the age of 40.

That’d be almost impressive if it weren’t totally illegal.

Recruitment experts know how best to utilize AI technology to achieve your goals and avoid discrimination.

Moral of the story: Never completely rely on AI

Don’t let these horror scenarios turn you off to big data analytics on the whole. As we’ve examined before on this blog, when utilized intelligently, many cutting-edge solutions can streamline the hiring process and eliminate biases to a remarkable degree.

This technology, however, always requires a countervailing force, be it a trained in-house hiring manager or an outsourced recruitment expert. These people ought to know what data is discriminatory, why it’s discriminatory and how to avoid integrating it into your hiring practices. After all, lawmakers and regulatory bodies are watching.

In a similar vein, these professionals should know which data points actually matter, and can thereby determine whether an AI-powered solution will deliver its intended results or merely perpetuate prejudices that are worthless to hiring – not to mention humanity.

At the end of the day, the right recruitment partner will know the best ways to leverage technology and the knowledge of a talented team of executive search specialists. That’s where we come in. Talk to the recruitment experts at EBC Associates today to learn about how you can hire with passion and intelligence.