Hiring automation technology: rewriting the script

According to Pew Research Center’s 2017 Automation in Everyday Life survey, Americans are more concerned than excited about emerging automation technologies; many people don’t look forward to a world where machines do all of the thinking and work. Of all the technologies considered in the survey, the one people want least is hiring automation technology. 

In fact, only 22% of respondents reported feeling enthusiastic about the development of hiring algorithms, and 76% said they would not want to apply for a job that used a computer program to select applicants, due mostly to beliefs that “computers can’t capture everything about an applicant” and that algorithms are “too impersonal”.

This distrust of hiring automation is particularly striking when compared to attitudes toward things like “A future where robots and computers can do many human jobs” (33% enthusiastic) or “The development of robot caregivers for older adults” (44% enthusiastic). It seems people are more comfortable with the idea of a robot taking care of their grandparents than having a computer interview them for a job. 

Clearly, there is a “personal” element to the hiring process that people perceive as being important. As one survey respondent elaborated, “A lot depends on the sophistication of the program, but I do believe that hiring people requires a fair amount of judgment and intuition that is not well automated. I feel that I have a lot of skills that would be hard to quantify and require that kind of in-depth thinking and judgment.” 

This is not an unreasonable evaluation; machine learning algorithms are built on mathematical rules that are not designed to take “intuition” into account. They are, by definition, “cold, inhuman, calculating machines”. What is interesting is that people would rather be judged by human intuition than by mathematical formulas, even though there are well-documented problems with hiring managers relying on intuition or gut instinct during the hiring process: namely, that humans have an inherent bias toward people who are similar to them.

Psychological research on what is known as the similar-to-me effect suggests that we favour people who look, think and/or act like us over people who don’t. For example, research has shown that employers are more likely to hire a candidate who is both competent and culturally similar to them, and that both black and white raters give higher performance ratings to employees of their own race. The increasing emphasis on “culture fit” as a hiring priority in recent years has also drawn attention from experts who believe this criterion more often produces a fit between the interviewer and the interviewee than between the interviewee and the organisation.
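
Such disparities can be quantified. As a minimal sketch, with entirely hypothetical applicant counts and group labels, the “four-fifths rule” used in US adverse-impact analysis flags when one group’s selection rate falls well below another’s:

```python
# Illustrative sketch of the "four-fifths rule" for adverse impact.
# All counts and group labels below are hypothetical.

hired = {"group_a": 48, "group_b": 30}      # candidates hired, per group
applied = {"group_a": 100, "group_b": 100}  # candidates who applied, per group

# Selection rate = hired / applied for each group.
rates = {g: hired[g] / applied[g] for g in hired}

# Adverse-impact ratio: lowest selection rate divided by the highest.
# A ratio below 0.8 is the conventional red flag for disparate impact.
ratio = min(rates.values()) / max(rates.values())

print(f"Selection rates: {rates}")
print(f"Adverse-impact ratio: {ratio:.2f}"
      + (" (below 0.8 -- worth investigating)" if ratio < 0.8 else ""))
```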

A major advantage of using technology is the ability to detect these types of bias. But the strong aversion people feel toward hiring algorithms and workplace automation in general suggests there is a level of misunderstanding and mistrust regarding this technology that business leaders must address. Here are a few methods to do so:

1. Emphasise a “People + Technology” rather than a “Technology – People” philosophy

Could talent decisions that were once the responsibility of people eventually become the responsibility of machines? Probably. But most organisations adopt machine learning not to replace humans with machines, but to give humans better data with which to make decisions. Companies exploring hiring automation should position the technology as an aid to hiring decisions rather than a replacement for human judgment, and clearly communicate this philosophy to candidates and employees.

2. Recognise that not all hiring decisions should be automated

Automation can save time and increase decision accuracy, but efficiency and accuracy shouldn’t be the only factors in deciding whether hiring automation is used. As Google’s head of People Analytics Prasad Setty describes, certain decisions are inherently more meaningful when organisational leaders are accountable for them. After Setty’s team developed a mathematical formula that could reproduce, with 90% accuracy, the promotion decisions committees were making as a group, they expected committee members to be thrilled with the time- and effort-saving solution. Instead, the committees hated the idea of hiding behind a formula, saying, “For such important decisions as promotions, we don’t want to stand behind a black box and say that the formula made me do so. We want to stand behind these decisions.”

Important decisions about people should be made by people. Hiring automation can and should help guide staffing decisions, but it should not be seen as taking decision-making responsibility away from business leaders. This is critical to ensuring that employees and candidates feel fairly treated: the belief that the process behind decision outcomes is fair is a core component of procedural justice, which research has shown can positively influence job performance, job satisfaction, organisational commitment and organisational trust.
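
Setty’s formula itself has never been published. Purely as an illustration of the underlying idea, the sketch below (with invented features, weights and candidate data) shows how one might measure how often a simple scoring rule agrees with a committee’s decisions:

```python
# Hypothetical sketch: how often does a simple scoring rule agree with
# human committee decisions? Features, weights and records are invented;
# Google's actual formula is not public.

# Each record: (performance_score, tenure_years, committee_promoted)
candidates = [
    (4.5, 3, True), (3.1, 2, False), (4.8, 5, True),
    (2.9, 4, False), (4.7, 1, False), (4.6, 4, True),
]

def formula_predicts_promotion(performance, tenure):
    """A toy linear rule standing in for the real (unpublished) formula."""
    return 0.8 * performance + 0.1 * tenure > 3.8

# Agreement rate: fraction of cases where the rule matches the committee.
agreements = sum(
    formula_predicts_promotion(perf, tenure) == promoted
    for perf, tenure, promoted in candidates
)
print(f"Agreement with committee: {agreements / len(candidates):.0%}")
```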

3. Understand the limitations of automation and have appropriate oversight in place

Technology can be very useful for detecting inequities, but it is important to recognise that machines are not entirely immune from bias. In 2016, ProPublica published a story titled ‘Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks.’ The story examined COMPAS, a risk assessment tool that forecasts the likelihood of criminal reoffending and is used by judges in bail and sentencing decisions. When COMPAS’s predictions were compared with actual reoffending data, black defendants were almost twice as likely as white defendants to have been labelled “high risk” without actually reoffending, while white defendants were much more likely to have been labelled “low risk” yet go on to commit other crimes.
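
ProPublica reached that conclusion by comparing predicted risk labels with actual outcomes, broken down by group. As a minimal sketch using synthetic records (not the real COMPAS data), the same kind of audit looks like this:

```python
# Illustrative audit in the style of ProPublica's COMPAS analysis:
# compare predicted labels with actual outcomes, per group.
# All records and group labels are synthetic.

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("group_a", True, False), ("group_a", True, True),
    ("group_a", True, False), ("group_a", False, False),
    ("group_b", False, True), ("group_b", False, False),
    ("group_b", True, True), ("group_b", False, True),
]

def error_rates(group):
    rows = [r for r in records if r[0] == group]
    # False positive rate: labelled high risk among those who did not reoffend.
    fp = sum(1 for _, pred, actual in rows if pred and not actual)
    did_not_reoffend = sum(1 for _, _, actual in rows if not actual)
    # False negative rate: labelled low risk among those who did reoffend.
    fn = sum(1 for _, pred, actual in rows if not pred and actual)
    reoffended = sum(1 for _, _, actual in rows if actual)
    return fp / did_not_reoffend, fn / reoffended

for g in ("group_a", "group_b"):
    fpr, fnr = error_rates(g)
    print(f"{g}: false-positive rate {fpr:.0%}, false-negative rate {fnr:.0%}")
```

A disparity like the one above, where one group is over-flagged and the other under-flagged, can exist even when overall accuracy looks reasonable, which is why per-group auditing matters.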

This finding does not mean hiring algorithms should be avoided simply because they might be biased. It does, however, underline the responsibility organisations have to strike an appropriate balance between people, automation technology and decision oversight.

The emergence of automation technologies in the workplace may be inevitable, but employees’ negative perceptions of these developments don’t have to be. It is up to business leaders to help employees understand and trust the role, value and limitations of automated technologies and their effect on decision-making in the workplace. Automation technology has tremendous potential to improve our world. But it is our responsibility to ensure that world is not one where all decision-making power is handed to machines.

Lauren Pytel, Research Scientist, HCM Research at SAP SuccessFactors

Image Credit: Tim Gouw / Pexels