IIM ROHTAK HUMANE-R PRESENTS IMPRESSIONS July 2022 HUMANE.R@IIMROHTAK.AC.IN
"HR Technology- Biases in Recruitment"
Does technology amplify or prevent bias in recruiting practices? The answer to this fundamental question is harder to come by than it first looks, and it has become a source of contention between supporters and detractors of the technology. A hire is typically the result of several smaller, earlier decisions, and algorithms play a variety of roles throughout this process: some direct job advertisements to specific prospects, others surface passive candidates for recruiters, and predictive tools analyze and rank resumes, helping hiring managers evaluate candidate competencies in new ways by combining old and new data. Experts believe that by bringing consistency to the recruiting process, algorithms can help human decision-makers avoid their own biases. However, algorithms also carry new dangers: they can reproduce institutional and historical prejudices and magnify existing disadvantages embedded in data points such as performance evaluation results or university enrolment. Although algorithms may remove some subjectivity from the hiring process, people remain heavily involved in the final hiring decisions. Arguments that portray "objective" algorithms as more impartial and reliable than imperfect humans fail to acknowledge that both typically play a role. Research demonstrates that AI is frequently biased. Here are some considerations for making algorithms work for everyone.
Reference: The World Economic Forum
We must investigate how predictive technologies function at each stage of the employment process to understand bias in hiring algorithms and how to minimize it. Although they frequently have machine learning at their core, the tools used at different stages of the process may differ significantly; even systems that appear to do the same job may use entirely unrelated data sources or deliver predictions in very different ways. Our examination of prediction tools used throughout the hiring process sheds light on what "hiring algorithms" actually do and where and how bias can be introduced. Unfortunately, we discovered that most recruiting algorithms drift toward bias by default. While their potential to lessen interpersonal discrimination should not be underestimated, the only tools that offer any hope that predictive technology might help promote equity rather than undermine it are those that proactively address underlying disparities.

Selecting the best candidates

The hiring process begins before a job seeker ever applies. During the "sourcing" or recruiting stage, predictive technologies help advertise vacancies, alert job seekers to potentially interesting openings, and surface potential applicants to recruiters for proactive outreach.
To reach the most "relevant" job seekers, many firms use algorithmic ad platforms and job boards to attract applicants. These algorithms, which promise corporations better use of their hiring dollars, frequently make flimsy predictions: they don't predict who will be successful in the role, but who will click on the job posting. Even when employers intend nothing of the sort, these predictions can cause job advertisements to be delivered in ways that reinforce gender and racial stereotypes. In a recent study co-authored with colleagues from Northeastern University and USC, we found, among other things, that broadly targeted Facebook ads for supermarket cashier positions were shown to an audience that was 85% female, while ads for taxi company jobs went to an audience that was roughly 75% black. This is a prime illustration of how an algorithm can inadvertently reproduce prejudice.
Reference: Riach & Rich (2006)
Personalized job boards like ZipRecruiter try to learn recruiters' preferences automatically and use those predictions to find applicants with similar profiles. Like Facebook's ad system, these recommendation engines are built to identify and replicate patterns in user behavior, updating their predictions dynamically as employers and job seekers interact. If the system notices that recruiters interact more frequently with white men, it may find proxies for those traits (such as being named Jared or having played high school lacrosse) and replicate that pattern. This kind of adverse effect can happen without explicit instruction and, worse, without anyone being aware of it. Sourcing algorithms are probably not what most people think of first when they hear the phrase "hiring algorithm," yet automated decisions are pervasive at this stage of the employment process. For instance, the tool that Amazon abandoned because it disadvantaged women was not a selection tool for evaluating actual applications; it was a tool to help recruiters find passive candidates to approach. The law scholar Pauline Kim has stated that while sourcing algorithms may not explicitly reject candidates, "not telling people of a job offer is a highly effective obstacle" to job seekers. These tools are crucial in defining who gets access to the employment process at all, even if they rarely make for dismal headlines.

Streamlining the funnel

When applications begin to arrive, businesses try to concentrate on the best applicants. Although the algorithms used at this point are frequently presented as decision-support tools for hiring managers, in practice they can automatically reject a sizable portion of applicants.
Reference: The World Economic Forum
Some of these screening algorithms are old methods with a technological makeover. Historically, employers used "knockout questions" to determine whether applicants met the minimum requirements; today, chatbots and resume-processing programs do this job. Other systems go further, employing machine learning to predict screening outcomes from prior selections, allegedly reducing the impact of human prejudice while saving companies time. Asking screening systems to imitate previous employment decisions can seem logical at first, but those decisions frequently mirror the very tendencies that many organizations are working hard to change through diversity and inclusion initiatives. Other selection techniques use machine learning to forecast which candidates will "succeed" on the job, where success is frequently measured by tenure, productivity, or performance (or by the absence of signals like tardiness or disciplinary action). More recent techniques in this field claim to let employers generate predictions from subtler signals, such as gameplay or video analysis. In the United States, these selection processes are governed by long-standing employment law: employers are liable for practices that disproportionately screen out one group of applicants and are required to check their evaluation tools for adverse impact against demographic subgroups. Many assessment providers describe at length the steps they take to "de-bias" their algorithms, steps that also help keep their clients compliant with the law. However, separating high performers from low performers often involves subjective judgments, a well-known source of discrimination in the workplace. Debiasing a hiring algorithm built from performance data tainted by sexism, racism, or other systemic biases is like putting a bandage on a deep wound.
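The adverse-impact check mentioned above is commonly operationalized with the "four-fifths rule" from the EEOC Uniform Guidelines: a group's selection rate should not fall below 80% of the highest group's rate. A minimal sketch, with purely illustrative group labels and counts:

```python
# Four-fifths (80%) rule check for adverse impact. The group names and
# counts below are hypothetical, not real hiring data.

def selection_rates(outcomes):
    """outcomes: {group: (selected, applied)} -> {group: selection rate}"""
    return {g: sel / app for g, (sel, app) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Return groups whose selection rate is below `threshold` times
    the highest group's rate, with their impact ratios."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: round(r / best, 2) for g, r in rates.items() if r / best < threshold}

outcomes = {"group_a": (48, 120), "group_b": (22, 100)}
print(adverse_impact(outcomes))  # group_b: 0.22 / 0.40 = 0.55, below 0.8, so flagged
```

A check like this is only a floor, of course: a tool can pass the four-fifths rule and still encode the subjective, tainted performance labels the paragraph above describes.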
Employers can readily defend adopting a selection algorithm that produces unfair results if they can show that their tool serves a specific commercial objective, which is a pretty low hurdle.
Although certain industrial-organizational psychologists, who are frequently involved in the creation of recruiting practices, are wary of basing new selection methods purely on theoretical correlations, nothing in the current regulatory rules requires companies to do much more. Predictive technology is also used after a candidate has been chosen, to help employers craft offers the candidate is likely to accept. Such methods could circumvent rules barring employers from explicitly asking candidates about their compensation histories, locking in long-standing patterns of pay disparity or at the very least making them harder to reform.
Hiring algorithms that take equity into account

While current U.S. law places some restrictions on businesses using predictive hiring tools, these restrictions are inadequate to address the expanding hazards posed by machine-learning-enhanced employment tools. How, then, can we make sure that hiring algorithms actually support equity? Industry-wide best practices and regulation, however slow-moving, undoubtedly have essential roles to play. In the meantime, companies employing predictive hiring technologies and the vendors building them must look beyond the bare minimum of legal compliance and ask whether their algorithms lead to more equitable hiring practices. Before adopting any predictive technology, they should assess how subjective success measures may skew a tool's predictions over time. Rather than merely checking for adverse impact at the selection step, employers should monitor their pipeline from beginning to end to find points where latent bias lurks or re-emerges. If those praising the ability of algorithms to minimize hiring bias don't actively develop and test their tools with that objective in mind, the technology will, at best, struggle to live up to that promise and, at worst, may even work against it.
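Monitoring the pipeline "from beginning to end" amounts to tracking pass-through rates per demographic group between each stage, so that a disparity introduced at sourcing or screening is visible long before the offer stage. A minimal sketch, with hypothetical stage names and counts:

```python
# Sketch of end-to-end funnel monitoring: pass-through rate per group
# between consecutive hiring stages. All names and numbers are illustrative.

FUNNEL = [
    # (stage, {group: candidates remaining at this stage})
    ("sourced",     {"group_a": 500, "group_b": 500}),
    ("screened",    {"group_a": 200, "group_b": 120}),
    ("interviewed", {"group_a": 60,  "group_b": 30}),
    ("offered",     {"group_a": 12,  "group_b": 5}),
]

def stage_pass_rates(funnel):
    """For each pair of consecutive stages, compute the fraction of each
    group that advanced. A large gap between groups at any transition
    flags where bias may be entering the pipeline."""
    report = []
    for (prev_name, prev), (name, cur) in zip(funnel, funnel[1:]):
        rates = {g: cur[g] / prev[g] for g in cur}
        report.append((f"{prev_name}->{name}", rates))
    return report

for step, rates in stage_pass_rates(FUNNEL):
    print(step, rates)
```

In this made-up funnel the screening step is where the groups diverge most (0.40 vs. 0.24 pass-through), which is exactly the kind of latent disparity an offer-stage-only audit would miss.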
Reference: AI & Ethics 2022 (Lorenzo Belenguer)
Kalpesh Khandare Symbiosis Institute of Management Studies, Pune
“Acquiring the right talent is the most important key to growth. Hiring was — and still is — the most important thing we do.” – Marc Benioff
It is beyond doubt that the HR Manager plays a pivotal role in recruitment into any organisation: preparing job descriptions, activating sources, screening and shortlisting resumes, reviewing performance through aptitude tests, conducting interviews (personal interviews and group discussions), and evaluating candidates. The interviewee, in turn, prepares to get into the organization, from writing the CV to preparing for the interview. Everyone expects that an HR Manager will recruit with fair judgement and hire a candidate based solely on their ability to perform well. That not only gives the suitable candidate an opportunity but also assists the growth of the organization. At times, however, it is difficult to determine whether a candidate is the right fit, and the interview process is ultimately an exercise in judgement. The HR Manager relies heavily on that judgement, which is almost always unconsciously biased.
We cannot entirely blame recruiters for hiring biases: we are trained from birth to make decisions on the basis of certain criteria, and those criteria are ingrained subconsciously. Bias also occurs because our brain looks for shortcuts that help us make decisions instantly. An HR Manager may hire on the basis of appearance, gender, body language, common interests, family background, gut feeling, first impression, race, religion, and so on. Hiring bias can be classified as follows:

Confirmation bias (The Blind Manager bias)
In this type of bias, the interviewer only verifies the points they already expect of the candidate based on the CV. In other words, recruiters are anchored to a preconception of the candidate that comes from the CV.
Affect heuristics
This is a situation where the recruiter reaches a conclusion about a candidate's ability to do the work without thoroughly reviewing all of the evidence beforehand, judging the candidate's suitability for the position directly. This can lead to hiring an unsuitable candidate, because the suitable candidate may never get the opportunity to prove himself or herself.
Gender and racial bias
This is a common type. Gender and racial bias occur when a recruiter considers candidates of a certain gender or race more suitable for the position; the recruiter holds a fixed view or opinion about a certain gender or race.
Similarity bias
In similarity bias, the recruiter shares similar traits or interests with the candidate and, on the basis of that similarity, is more likely to hire them, focusing on the similarity rather than the candidate's capability or qualifications. Similarity bias commonly influences recruiters because, as humans, we like to surround ourselves with people who are similar to us.

Nonverbal bias
Apart from verbal communication, everyone has a typical way of communicating non-verbally, and sometimes the interviewer places too much emphasis on these non-verbal aspects. This kind of bias can be detrimental to people from lower socioeconomic strata, who may not have been trained in non-verbal presentation, and candidates from different cultures may simply not display the body language the interviewer prefers.
Halo bias (poor-hire bias)
In this type of bias, the recruiter hires the candidate on the basis of a single positive trait, activity, or skill that overshadows all the others, assuming on the strength of that one trait that the candidate can perform every assigned task efficiently.
Horn bias
This is the exact opposite of halo bias: the recruiter focuses solely on a single mistake or negative trait, presumes the candidate is inappropriate for the position and knows nothing, and thereby treats the candidate with unfair prejudice.
Expectation anchor bias
Expectation anchor bias occurs when recruiters anchor onto one particular piece of information about a candidate and lean on it when making decisions.
First impression bias
This is similar to non-verbal bias: the recruiter judges and recruits on the basis of first impressions. If the candidate seems confident, the recruiter decides to hire that person. It is often said that a recruiter decides within the first ten minutes whether a candidate will be hired.
Intuition / overconfidence bias
Sometimes recruiters make decisions on gut feeling, intuitively hiring a candidate on the basis of Emotional Quotient (EQ) or factors like emotion rather than capability. Overconfidence bias is closely related, and the two can be treated as one: the recruiter is simply certain that they have selected the right candidate.
Contrast bias
This is the bias in which candidates appearing for the interview are compared against one another rather than against the requirements of the role.
Beauty bias
In this type of bias, the recruiter selects and hires on the basis of appearance and demeanour, assuming that a well-groomed or good-looking candidate is suitable for a given position, will represent the organisation better, and will create a positive impression of it.
However, there are various ways to mitigate the unconscious biases elucidated above:

1. Using scoring criteria: Many organizations have introduced scoring criteria to reduce unconscious bias while recruiting. This helps eliminate several kinds of bias and supports selection on the basis of the traits actually required. The strategy brings clarity to decision-making and helps recruit candidates who are genuinely appropriate for the position.

2. Providing training to recruiters: Training should cover, at a minimum: avoiding questions irrelevant to the position and focusing instead on past experience; keeping an open mind so as not to judge on the basis of appearance and non-verbal cues; not discussing personal interests; and analysing the candidate's job satisfaction and attitude.

3. Conducting anonymous tests: Here the candidate's name is not mentioned; a code is used instead. This reduces the chance of judgement or bias, and the recruiter focuses on the assignment itself.

4. Using multiple recruiters: This reduces the chance of individual bias, and all the recruiters can ask relevant questions, which helps in deciding whether the candidate has potential for the position. Opening up the pool of interviewers allows more skills to shine through.

5. Taking notes throughout interviews: Accurate, concise notes broaden decision-making and give the interviewer a clearer view of the candidate's capability. The recruiter can then compare the skills needed for the position against the skills of the candidate.

6. Minimising irrelevant discussion: The recruiter must focus on extracting relevant information from the candidate. This minimises the chances of similarity bias.
7. Introducing Artificial Intelligence (AI)-based technology in the organization: There is no doubt that the 21st century has witnessed the rise of Artificial Intelligence, and it can play a vital role in the Human Resource domain. It can reduce the workload of HR Managers by collecting candidate data from CVs and using that data to source and shortlist candidates with the characteristics the role requires. E-recruitment will transform hiring by helping select the right fit and by attracting suitable candidates from across the country or around the globe. AI-based technology has already been introduced at several leading companies such as L'Oréal, Deloitte, Cisco, McDonald's, and Johnson & Johnson.
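Several of the mitigations above, scoring criteria, anonymous candidate codes, and multiple recruiters, can be combined mechanically. A minimal sketch in which candidates are known only by code and each interviewer scores the same fixed rubric; the criteria, weights, and scores are all illustrative:

```python
# Anonymized, rubric-based panel scoring. Rubric criteria, weights,
# candidate codes, and scores are hypothetical examples.

RUBRIC = {"technical_skills": 0.4, "communication": 0.3, "role_experience": 0.3}

def weighted_score(scores, rubric=RUBRIC):
    """Weighted average of one interviewer's rubric scores (each 0-10)."""
    return sum(rubric[c] * scores[c] for c in rubric)

def panel_score(panel):
    """Average the weighted scores from several interviewers, so no
    single recruiter's bias dominates the decision."""
    return sum(weighted_score(s) for s in panel) / len(panel)

# Candidates identified only by code, never by name.
candidates = {
    "C-101": [{"technical_skills": 8, "communication": 6, "role_experience": 7},
              {"technical_skills": 7, "communication": 7, "role_experience": 6}],
    "C-102": [{"technical_skills": 6, "communication": 9, "role_experience": 5},
              {"technical_skills": 5, "communication": 8, "role_experience": 6}],
}
ranked = sorted(candidates, key=lambda c: panel_score(candidates[c]), reverse=True)
print(ranked)
```

Fixing the rubric and weights before interviews begin is the key design choice: it forces every interviewer to justify a score against the same criteria rather than against a first impression.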
CONCLUSION

As discussed above, there are myriad biases, and all of them interfere with selection, lead to recruiting inappropriate candidates, and hinder the productivity of the organization. In other words, hiring bias is a barrier to reaching potential. To avoid such bias, the recruiter can keep a few things in mind:

1. The recruiter must weigh the knowledge, skill set, and experience gained by the candidate.

2. The candidate must be willing, with a positive attitude towards their tasks and towards learning more about their field.

3. The recruiter should also consider personal attributes such as job satisfaction and past job success or failure.

4. Companies can introduce Artificial Intelligence-based technology to mitigate biases in the recruitment process.

In a nutshell: "You're not just recruiting employees; you are sowing the seeds of your reputation."
By Ananya Modak Shri Ramdeobaba College of Engineering and Management, Nagpur