Ward Off Hiring Bias with and within AI Recruiting Tools
In a world that values meritocracy, it’s essential to ensure that hiring practices are fair and unbiased. Not only is it the right thing to do, but it also impacts employer branding, workplace diversity, and the organization’s legal standing.
But hiring bias is real!
An NBER study revealed that “white-sounding” names get 50% more callbacks than “Black-sounding” names, while 48% of HR managers themselves admit that hiring bias affects their candidate choices. So, whether it’s an avoidable case of unconscious bias in hiring or an unfortunate case of outright hiring discrimination, HR professionals need to be aware of the potential biases in their recruiting process and proactively address them. That is what we discuss in this guide to combating unfair hiring practices with AI recruiting tools.
We’ll help you identify signs of unconscious bias in recruitment and show how current AI hiring tools offer solutions to discriminatory hiring practices beyond blind recruitment. Recognizing that there has been an outcry about AI bias in hiring in the past (the infamous “Amazon AI hiring bias” example), we’ll see how leading HR tech firms have addressed bias in hiring algorithms. Lastly, we’ll answer some common queries related to hiring bias and its AI solutions.
Before we kick off, let’s define some basic terminology of bias and discrimination in recruitment.
What is Hiring Bias?
Hiring bias refers to the unconscious tendency of recruiters or hiring managers to favor certain candidates based on factors such as gender, race, age, educational background, or other personal aspects that do not necessarily contribute to the qualifications required for the job.
Sometimes, hiring bias is used interchangeably with hiring discrimination. But the two are different, and using the wrong term can escalate any unfair hiring practice into a bigger issue.
Hiring Bias vs. Hiring Discrimination
We defined hiring bias above. Hiring discrimination, by contrast, is the conscious and intentional exclusion or mistreatment of certain candidates based on protected characteristics such as race, gender, or religion.
Take this example of an unfair hiring practice: a hiring manager tends to favor candidates who share their alma mater, justifying that they recognize the standard of that education and training first-hand. One observer may view it as unconscious bias in hiring, while another may call it out as a discriminatory hiring practice.
While only a thorough investigation can bring out the truth, the example highlights that distinct criteria differentiate hiring bias from hiring discrimination.
| Criteria | Hiring Bias | Hiring Discrimination |
| --- | --- | --- |
| Intention | Unintentional or implicit, often rooted in unconscious biases and cultural stereotypes. | Intentional and often overt, motivated by prejudice or a desire to maintain a homogenous workplace. |
| Impact | Can limit diversity and inclusion efforts and lead to a less diverse workforce. | Can violate anti-discrimination laws and result in legal action and negative publicity for the organization. |
| Mitigation | Can be reduced through awareness training, structured hiring processes, and data-driven decision-making. | Requires a comprehensive anti-discrimination policy, enforcement of equal employment opportunities, and ongoing monitoring and evaluation of hiring practices. |
Hiring discrimination is a more straightforward violation of labor laws, with evident signifiers. But hiring bias in the workplace is trickier. Let’s take a deeper look.
How Does Hiring Bias Impact the Workplace?
Hiring bias can have a subtle adverse impact on the workplace, like a lack of diversity in teams, friction in team dynamics, lowered employee morale, etc. It can even cause a visible drop in the organization’s revenue.
Lack of diversity and innovation
Hiring bias can lead to a lack of diversity in the workplace, resulting in a homogeneous workforce that lacks different perspectives and experiences.
A lack of diversity and inclusion can limit innovation and creativity in problem-solving and decision-making processes.
Low employee morale
Employees who feel excluded or undervalued due to hiring bias can experience lower levels of job satisfaction and engagement, resulting in reduced productivity and higher turnover rates.
Negative company reputation
Companies that are flagged for discriminatory hiring practices in the market suffer damage to their reputation, resulting in difficulty attracting top talent and retaining existing employees.
Legal consequences
The thin line between hiring bias and hiring discrimination can lead to legal consequences, including lawsuits and negative media coverage.
How to Identify Unconscious Bias in Hiring?
Valeria Stokes, Chief Human Resources Officer and Staff Diversity Officer for the American Bar Association, highlights a critical point in the challenge of solving hiring bias:
“The problem is that many businesses and managers will say, ‘We don’t even need to talk about diversity because we’re only interested in hiring the best talent,’ and then they’re ignoring unconscious bias. What they assume is that there’s a level playing field for women, for people of color, for economically and socially oppressed groups — and that means those groups aren’t going to be given a fair shake.”
Clearly, identifying unconscious bias in recruitment is the first step to correcting your organization’s ongoing unfair hiring practices. Here are five common ways HR professionals can spot potential unconscious bias in recruitment:
Review candidate selection criteria
Check if your criteria for selecting candidates are based solely on job requirements and not influenced by personal preferences.
Analyze applicant data
Look for patterns in data, such as disproportionate hiring of a particular demographic group.
Conduct blind recruitment
Remove identifiable information like name, address, and schools from resumes before reviewing them.
Expand talent sourcing pool
Diversify recruitment channels beyond employee referrals and job postings for candidate sourcing.
Train hiring managers
Provide unconscious bias training to your hiring managers to raise awareness and mitigate potential biases in the hiring process.
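The blind recruitment step above can be sketched in code. The snippet below is a minimal illustration, assuming resumes are already parsed into dictionaries; the field names are hypothetical and will differ in any real resume parser.

```python
# Minimal sketch of blind resume screening: strip fields that could
# trigger unconscious bias before a reviewer sees the profile.
# Field names are hypothetical; real resume parsers vary.

IDENTIFYING_FIELDS = {"name", "address", "school", "photo_url", "birth_year"}

def redact_resume(resume: dict) -> dict:
    """Return a copy of the parsed resume with identifying fields removed."""
    return {k: v for k, v in resume.items() if k not in IDENTIFYING_FIELDS}

resume = {
    "name": "Jane Doe",
    "address": "12 Elm St",
    "school": "State University",
    "skills": ["Python", "SQL"],
    "years_experience": 6,
}

blind = redact_resume(resume)
print(blind)  # only job-relevant fields remain
```

In practice, the tricky part is that proxies for identity (graduation year, zip code, club memberships) also leak through, which is why redaction alone is a first step, not a complete fix.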
Having identified bias in recruitment and resolved to fix it, you have to approach it smartly so as not to attract unwanted attention from outside the organization, which can hamper your corrective efforts. Likewise, a heated confrontation can cause alarm within the organization. In the next section, we see how to go about it on different fronts.
How to Approach Bias in Recruitment?
Biases are universal, and they get triggered in all of us in some way. When it comes to hiring bias, the triggers are generally language, location, level of education, or gender.
Companies can address the challenges of biased and discriminatory hiring practices to achieve internal hiring goals by working in tandem with people, processes, and tools. Here’s how to approach each of these aspects:
People
Build a team from different departments and backgrounds that recognizes the biases in your company. Challenge this team to keep diversity in mind and act as gatekeepers if discrimination occurs.
This group should also watch out for biases in the processes and hiring tools used by the organization.
Processes
All companies have their go-to sources for finding talent. They visit certain job sites, colleges, or organizations to find ideal candidates. While turning to your usual reliable sources, also add in new ones. Try adding different job boards and placement agencies to your array of sources. Collaborate with your diverse hiring team on the areas of concern in your hiring, and then find places to address those areas.
It’s also important because, for your candidate sourcing tool or recruiting software to collect a deep and impartial talent pool, its sources need to be diverse in the first place.
Tools
Once you have the people and processes in place, it’s time to examine your hiring tool for any bias.
Take a close look at the information being fed to your technology to determine how it might introduce bias. For instance, look at the words and phrases that populate each job listing (e.g., “go-getter,” “self-starter,” “nurture,” “aggressive”). Do those phrases lean toward a particular gender or race, or seem biased in any other way?
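One rough way to operationalize that job-listing check is to scan each posting against word lists flagged by research on gendered language. The sketch below uses small illustrative word lists, not an authoritative lexicon:

```python
# Rough sketch: flag potentially gender-coded words in a job listing.
# Word lists here are illustrative samples only.
import re

MASCULINE_CODED = {"aggressive", "dominant", "go-getter", "competitive", "ninja"}
FEMININE_CODED = {"nurture", "support", "collaborative", "interpersonal"}

def flag_coded_words(listing: str) -> dict:
    """Return the coded words found in a listing, grouped by lean."""
    words = set(re.findall(r"[a-z\-]+", listing.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We need an aggressive go-getter to nurture client relationships."
print(flag_coded_words(ad))
# {'masculine': ['aggressive', 'go-getter'], 'feminine': ['nurture']}
```

A flag is a prompt to reword, not proof of bias; a human editor still decides whether the phrasing actually skews the listing.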
Look at how sophisticated your recruiting software is to determine whether it and your team can work together to be stewards of diversity for your company. For instance, AI recruiting tools can cut through potentially biased keywords to look at the depth of someone’s qualifications in assessing their ability to perform a job.
Unite your team and processes with cutting-edge, innovative AI to bypass potential hiring bias.
Yes, the stress is on AI for hiring when it comes to the latest recruitment solutions.
How Does AI Help Combat Unconscious Hiring Bias?
Companies often blame their lack of diversity on a shortage of qualified candidates. But it’s not about a talent shortage. It’s about capitalizing on your assets and identifying the large pools of diverse talent that are being overlooked and waiting to be discovered.
Basically, you won’t bring in a diversified pool of qualified candidates with unconscious bias in hiring hindering your efforts. And AI has the potential to level the playing field for all individuals, no matter their sex, race, age, or background.
Ted Greenwald, a Wall Street Journal reporter on artificial intelligence, says, “These [AI-powered] applications aim to analyze a vast amount of data and search for patterns — broadening managers’ options and helping them systematize processes that are often driven simply by instinct. And just like shopping sites, the AIs are designed to learn from experience to get an ever-better idea of what managers want.”
The idea that you can methodically stamp out unconscious hiring bias (also called implicit or hidden bias) has caught fire with tech companies because it’s relatively new, data-driven, and blameless – everyone is told they have it and can’t avoid it, so no one is singled out or gets defensive.
Automates blind hiring
AI hiring tools can draw unique insight from large pools of data, produce data-driven decisions, and accelerate the recruiting process to place candidates faster — all while eliminating our unconscious bias. So, it makes sense to embrace Intelligence Amplification, using technology to augment human intelligence and capability to eliminate human bias from the recruiting process.
Leads to emotional intelligence
The best part about AI is that it allows recruiters to focus solely on the human side of recruitment — empowering them to help and impact more people than they could without it.
As the competition for the right people has taken priority in the staffing industry, business leaders have to be more discerning in their candidate selection. AI enables them to find a new approach to assess an applicant’s likelihood of success and create a pool of diversified talent — quickly and objectively.
Glen Cathey, SVP of Global Digital Strategy and Innovation at Randstad, feels the same way: “By speeding up the process and ensuring more relevant results, sourcing and recruiting pros can focus on connecting with candidates, engaging with them, and getting them hired. Instead of focusing on unbillable research, you can use AI to automate these mundane tasks to allow you to focus on your clients. So, instead of focusing on finding people, you can focus on recruiting people.”
It’s truly fascinating how AI recruiting tools are working to solve the problem of unconscious hiring bias. But it’s not all foolproof.
AI recruiting tools don’t operate in a vacuum. They need the eyes and ears of human expertise to build a diverse talent pool. Without this, you have cases like Amazon’s AI hiring bias that could give a wrong turn to the impression of unbiased AI hiring.
But Is the AI Recruitment Tool Itself Unbiased?
Amazon had developed an AI-powered recruiting tool designed to review resumes and make recommendations about the best candidates to hire. However, it was found that there was a bias in hiring algorithms that favored male candidates over female candidates!
The bias was traced back to the fact that the AI training data consisted mostly of resumes submitted to Amazon over a 10-year period, and during that time, the majority of applicants were male. As a result, the algorithm learned to associate certain keywords and phrases more strongly with male candidates, and it penalized resumes that contained language more commonly used by women.
Amazon reportedly abandoned the tool altogether in 2018, but the infamous incident (termed “Amazon AI hiring bias”) highlighted the challenges of developing unbiased AI tools and the importance of careful data selection and testing in AI development.
So, unless algorithms are trained with data representative of candidates from all demographics (and in equal proportions), flaws in the algorithms have the potential to create biases in every step of the hiring process, from talent attraction to screening and beyond.
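A simple first check along these lines is to compare each demographic group’s share of the training data against its share of the applicant pool, so gaps like Amazon’s male-dominated resume history are visible before training. This is a sketch with made-up labels and numbers:

```python
# Sketch: compare demographic shares in training data vs. applicant pool.
# Labels and counts are hypothetical.
from collections import Counter

def representation_gap(training_labels, applicant_labels):
    """Return {group: (training_share, applicant_share)} for every group,
    so over- or under-representation in the training set stands out."""
    train = Counter(training_labels)
    pool = Counter(applicant_labels)
    groups = set(train) | set(pool)
    return {
        g: (train[g] / len(training_labels), pool[g] / len(applicant_labels))
        for g in groups
    }

# Hypothetical data: training set is 80% male, applicant pool is 50/50.
gaps = representation_gap(
    ["M"] * 80 + ["F"] * 20,
    ["M"] * 50 + ["F"] * 50,
)
for group in sorted(gaps):
    print(group, gaps[group])
# F (0.2, 0.5)
# M (0.8, 0.5)
```

A large gap between the two shares (as with both groups here) is a warning that the model will learn the skew, which is exactly what happened in the Amazon case.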
Taking it further, let’s look at how AI bias in hiring causes a talent shortage, leaving behind a large pool of hidden workers.
The gap in predicting talent engagement
In sourcing talent for a role, recruiters use AI’s predictive technology to advertise open jobs to the right talent and identify qualified applicants. By using these algorithms to target candidates, predictions are made based on which applicants are most likely to click on the ad rather than which candidates would be the best fit for the job.
The issue with these predictions lies in how these job ads are delivered to potential candidates. In creating targeted ads on a social platform such as Facebook, biased AI systems can reinforce gender and/or racial stereotypes, among other biases.
Misinterpretation of past hiring data
One of the most popular uses of AI technology in the hiring process is the automation of screening resumes with algorithms.
AI technology is trained by humans and relies on past hiring data to identify features of a good or bad candidate. If prior biases existed within your organization, you’d run the risk of the system adopting specific attributes that are only common among your current workforce.
As an example, if most of your organization’s employees are males, the algorithm may learn from the existing data set to prefer male candidates over female, non-binary, etc. When an algorithm learns to filter candidates based on characteristics or identity traits rather than job competency, there’s a danger of exhibiting unintentional biases.
Interestingly, bias in hiring algorithms also stems from the fact that they are primarily trained to screen out candidates assumed to be “unfit for hire,” rather than pinpoint the applicants who would be most successful in the role. The result is an overlooked talent pool, also called “hidden workers.”
Suggested reading on hidden talent pool: Unlocking the Hidden Talent Pool
The authors of Hidden Workers: Untapped Talent identify several AI bias examples that put the diverse backgrounds of hidden workers on the back foot, including caregivers, veterans, immigrants and refugees, people with disabilities, partners of relocated workers, neurodiverse individuals, less-advantaged populations, ex-convicts, and those without traditional qualifications.
They may also include:
- People who are underemployed and working part-time jobs while willing and able to work full-time positions
- Applicants actively looking for work who have recently experienced extended unemployment for reasons such as health issues or family care responsibilities
- Full-time workers who are not actively looking for a new position, but may consider a change given the right circumstances
A bias-free AI hiring tool that can tap into this overlooked category can lead to great benefits for employers. But given the past of AI bias in hiring, how do we go about it?
So, How to Combat Bias in AI Recruiting Tools?
Cases like the Amazon AI hiring bias show that all HR tech developers, and the companies deploying their tools, need to take a closer look at AI-enabled hiring processes, as algorithms can do more harm than good when it comes to your commitment to diversity, equity, and inclusion.
Suggested reading on hiring bias: Combatting Bias in Machine Learning Algorithms: For Recruiting
Let’s take a step back and ponder: how did recruiting tools make us more aware of hiring bias and our need for diversity and equality?
Because technology is automating and repeating what we say, what we do, and what we learn. If it’s biased, we’re biased. If we’re biased, it’s biased. It’s a loop from which we quickly learned and identified gaps in our organizations and ethical guidelines; without technology in the equation, those unhealthy patterns may never end. That makes the loop a good thing: a starting point from which we can outline the bad and rectify faults, gaps, and misconceptions.
HR is in charge
So, first, take in that we are in charge of AI recruiting tools.
AI is learning from us, for us. We don’t have to accept every answer it gives us or make a decision based only on its recommendation. Our job is to collaborate with AI, letting it guide us while we retain the final decision-making power. If the results highlight bias in recruitment or let discriminatory hiring practices build into the processes, we can address the bias in the hiring algorithms or programs accordingly.
AI is not a know-it-all
We want technology to be just like us, but without our flaws.
Think about that for a second.
Why is it more acceptable that we have teams of employees making biased hiring decisions, but we’re horrified when a machine does it?
Because we’re human. We like humans. We trust humans. Machines, meanwhile, seem unpredictable and out of our control.
Or, at least, that’s how we see it.
For most ordinary people, machine learning lives in a black box. Most feel they can’t understand its inner workings, so it’s difficult to trust and depend on for critical decision-making, especially in a field like recruiting, where someone’s livelihood is affected. According to Genpact’s recent study, 59% of employees believe they would be more comfortable with AI if they understood it better.
So, either we accept that programs make the same unfair mistakes we would (though far less frequently), or we stop using AI recruiting tools. The second is not an option, because AI will replace those who don’t use it.
Mind the input in AI
Because it’s so difficult for us to recognize and understand our own conscious and unconscious biases, it’s even more difficult not to feed them into technologies. When that happens, they are then deeply embedded, relearned, and reinforced in a company’s decision-making.
Sankar Narayanan, Chief Practice Officer at Fractal Analytics Inc., says, “AI needs to be traceable and explainable. It should be reliable, unbiased, and robust to be able to handle any errors that happen across the AI solution’s lifecycle.”
So the question is, how do we feed quality data to minimize systemic hiring bias?
AI recruiting solution providers are beginning to use a body of work covering “fairness, accountability and transparency” in attempts to refine their systems so they produce “fair” results, as determined by “mathematical definitions”.
Trust in the company and its training process is a starting point, but experts say the real solution to black box AI is shifting to a training approach called glass box or white box AI.
Glass box AI modeling requires reliable training data that analysts can explain, change, and examine in order to build user trust in the ethical decision-making process.
Transparency in AI
In May 2019, the European Union released a standard guideline defining ethical AI use that experts are hailing as a step toward tackling the black box AI problem to create better decision-making software.
The guidelines will nudge businesses to trust the specialists and veer away from generalist service providers that have rebadged themselves as AI companies, by requiring:
- Large companies that either reach 1 million devices or make $50 million per year to conduct an automated systems impact assessment and a data protection impact assessment
- A detailed description of the decision-making system and the data used to train it
- An assessment of the risk that the data impacts accuracy, fairness, bias, discrimination, privacy, and security
Most companies that advertise AI as the core of their solution don’t do a great job of explaining their recommendation and decision-making criteria and strategies. Most of the time that doesn’t matter. But when compliance issues are at stake, a person’s career is on the line, and the company’s goals could be in jeopardy, well-documented, explainable AI decision-making should be open to all stakeholders.
Customized AI hiring tool
If you don’t define success from the very beginning, there will be no way to measure it down the road.
AI solution providers should ideally sit with their clients, outline their goals, and tailor the settings in the system to work for their business needs. You’ll discuss strategy and how to approach your recruiting process to keep hiring bias out of the equation.
So, how can organizations address bias in AI? Pay attention to the basics:
- Do you have adequate data (including diverse data) to train and test your algorithms? If not, augment with more generalized data from your industry and gather competitive data.
- Gain a deeper understanding of the features the algorithms are working with and eliminate the obvious ones that could contribute to biases (like gender information).
- Identify the right algorithms that are in line with your organizational objectives. Some algorithms have better capabilities to deal with biases.
- Set appropriate precision rules on top of models to achieve the desired results.
- What is the deeper philosophy behind the models and AI? This could profoundly influence your selection of algorithms.
And last but not least, monitor, refine, and update.
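The “monitor, refine, and update” step can be made concrete with a standard audit such as the four-fifths (80%) rule used in US adverse-impact analysis: each group’s selection rate is compared against the most-selected group’s rate. A minimal sketch with made-up numbers:

```python
# Sketch of a four-fifths (80%) rule adverse-impact audit.
# Counts below are hypothetical.

def adverse_impact_ratios(selected: dict, applied: dict) -> dict:
    """Each group's selection rate divided by the highest group's rate.
    A ratio under 0.8 is the conventional red flag (four-fifths rule)."""
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical audit: 40 of 100 group-A applicants hired vs. 20 of 100 group-B.
ratios = adverse_impact_ratios({"A": 40, "B": 20}, {"A": 100, "B": 100})
for group, ratio in sorted(ratios.items()):
    flag = "FLAG" if ratio < 0.8 else "ok"
    print(f"{group}: {ratio:.2f} {flag}")
# A: 1.00 ok
# B: 0.50 FLAG
```

Run against each stage of the funnel (sourcing, screening, interviewing, offers), an audit like this shows where the algorithm or process starts filtering a group out, which is exactly what ongoing monitoring is meant to catch.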
Know your role
We see that AI recruitment solution providers have to discuss, teach, and represent their AI well by guiding users through the strategy and execution of their AI’s decision-making – or they may lose them completely. On the other side, there are things we can do to expedite the whole process:
- A willingness to make critical changes to our processes, routines, and gut instincts by investing in AI tools
- An understanding that technology is a reflection of humans, not an error-free version of us
- A desire to dig deep into AI ethics and only work with vendors that reflect transparency, diversity, and security
Combatting bias in AI platforms won’t be a quick fix. It will take lots and lots of time to refine, test, and perfect. That’s why we must start small when approaching these systems. Once organizations incorporate their objectives and results-driven goals, they create better decision-making criteria for their teams and develop stronger recruiters who trust their data-driven results.
Over to You
People are your most important asset as a company. You can find the right people with the right data and AI-backed tools and techniques. But, whether with or without them, warding off hiring bias completely is only an ideal thought. Not practical. Still, as long as there is a balance between compliance and innovation, AI recruiting solutions will keep shining.
But finding the right AI recruitment solution to combat biased recruiting is critical. At Leoforce, we understand the need for data-driven hiring decisions that aim to reduce hiring bias. Our AI recruiting platform, Arya, keeps learning and improving, reducing human prejudice and boosting diversity initiatives.
See how exactly AI recruitment solutions can help with diversity at your workplace by requesting a personal demo.
What is an example of hiring bias?
An example of hiring bias is when a recruiter favors candidates who attended the same university as them, regardless of their qualifications.
What is the impact of hiring bias?
Hiring bias can impact the diversity of the workforce, limit talent pools, and perpetuate systemic discrimination.
What is the most common form of bias in the recruitment process?
The most common form of bias in the recruitment process is unconscious bias, which can lead to unintentional discrimination against certain groups.
What is an example of affinity bias?
An example of affinity bias is when a hiring manager favors a candidate because they share a similar background or interests.
How do you identify hiring discrimination?
Hiring discrimination can be identified by analyzing the demographic data of the candidate pool and evaluating the consistency and objectivity of the selection process.
How can you avoid discrimination in the recruitment process?
Discrimination in the recruitment process can be avoided by creating clear job descriptions, utilizing objective selection criteria, and implementing training on unconscious bias for recruiters and hiring managers.
What are the 3 types of bias?
The three major types of bias are cognitive bias, social bias, and structural bias.
How can artificial intelligence eliminate bias in hiring?
Artificial intelligence can help eliminate bias in hiring by using algorithms trained on diverse data sets, removing identifiable characteristics from resumes, and conducting regular audits to ensure fairness.
What are the issues with hiring algorithms?
Issues with hiring algorithms include perpetuating systemic biases, lack of transparency, and potential legal challenges.
How to reduce bias in hiring algorithms?
Bias in hiring algorithms can be reduced by including diverse data sets, using transparent algorithms, and conducting regular audits to evaluate their effectiveness.