Mysterious algorithms, black-box AI recruiters are binning our résumés
The software separating jobseekers and jobs remains a secret
Analysis When you submit a résumé for a position at a large company, you may or may not be contacted for further information or an interview.
Either way, you probably won't know why. Applicant tracking systems (ATS), the software used by employers to manage employment applications, are not generally open to public scrutiny.
"Unless the government or some individual brings legal action against an employer such that a court will give them access to it, no one looks inside ATS protocols," said Peter Cappelli, a professor of management and human resources at the University of Pennsylvania's Wharton School and the author of Why Good People Can't Get Jobs, in an email to The Register.
These algorithms are also not very good.
James Hu, cofounder and CEO of Jobscan, a company that helps job seekers improve their chances of being hired, said in a phone interview with The Register: "Every ATS is a little bit different. Some don't have search functions, some do. The way they rank candidates is old-school."
By that, Hu means the systems have shortcomings. For example, he said, they often rank by exact keywords, so the software may miss plural words if it is looking for a singular term. Or when the software encounters "analysis" in a résumé, it may not recognize that the job seeker has experience with "analytics." Hyphens and columns can also present problems, he said.
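The failure mode Hu describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual ranking code: an exact-token matcher misses plural forms, while even a crude normalization step recovers the match.

```python
import re

def exact_score(resume_text, keywords):
    """Exact-token matching, as old-school ATS ranking reportedly works:
    a keyword scores only if it appears verbatim in the resume."""
    tokens = set(re.findall(r"[a-z]+", resume_text.lower()))
    return sum(kw.lower() in tokens for kw in keywords)

def stemmed_score(resume_text, keywords):
    """The same scoring after crude plural stripping, showing how a small
    normalization step recovers matches the exact matcher drops."""
    def stem(w):
        # Strip a trailing "s" from longer words; illustration only.
        return w[:-1] if w.endswith("s") and len(w) > 3 else w
    tokens = {stem(t) for t in re.findall(r"[a-z]+", resume_text.lower())}
    return sum(stem(kw.lower()) in tokens for kw in keywords)

resume = "Built interactive dashboards and automated reports"
print(exact_score(resume, ["dashboard", "report"]))    # 0: plurals missed
print(stemmed_score(resume, ["dashboard", "report"]))  # 2
```

Note that simple suffix stripping still would not connect "analysis" to "analytics" — the two words diverge before any common stem — which is why Hu's harder examples require fuzzier matching than most of these systems attempt.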
In a 2014 article published in Career Planning and Adult Development Journal, Robin Schlinger, a résumé advisor, also notes that ATS software is highly variable. Some applications, she said, can read only text files or specific versions of Word, while some cannot read special characters.
Navigating ATS software has given rise to an industry that echoes the search engine optimization business. Those seeking employment are routinely advised to craft their résumés for machines and to pepper their résumés with the specific keywords in job descriptions.
"Job seekers should definitely be aware of the fact that these systems are being used and ensure that their résumés are customized to each they are applying for," said John Reed, senior executive director for staffing firm Robert Half Technology in an email to The Register. "Take note of keywords that are used in the description and think like a hiring manager. For example instead of 'experience with Java' you may want to say 'Java Development' or 'Java Developer.'"
Applicant tracking and talent management technology affects a great many job applicants. As many as 75 per cent of recruiters and talent managers rely on ATS, according to Capterra. And that figure may rise. The ATS market is expected to grow in the US at a CAGR of 7.58 per cent from 2016 through 2020, according to consultancy Technavio.
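For scale, that CAGR compounds to roughly a third more market over the period. Assuming four compounding years (2016 to 2020 exclusive — the figure could also be read as five):

```python
# Cumulative growth implied by a 7.58 per cent CAGR, compounded
# over four years (2016 -> 2020).
cagr = 0.0758
years = 4
total_growth = (1 + cagr) ** years - 1
print(f"{total_growth:.1%}")  # roughly 33.9% total market growth
```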
At the same time, the lack of transparency into how software makes decisions that affect people's lives has become an issue of civic concern, as people contemplate the risk posed by automated cars.
In its recently published guidance on preparing for a world full of software-driven automation, the National Science and Technology Council said, "As the technology of AI continues to develop, practitioners must ensure that AI-enabled systems are governable; that they are open, transparent, and understandable; that they can work effectively with people; and that their operation will remain consistent with human values and aspirations."
ATS software can be expected to get "smarter" as it incorporates AI-related technology like machine learning, as this employment opportunity at Oracle's Taleo demonstrates. Yet it won't necessarily become more open, transparent, and understandable.
Raising the alarm
The issue came up this week at a hearing held by the US Equal Employment Opportunity Commission, "Big Data in the Workplace: Examining Implications for Equal Employment Opportunity Law."
In prepared testimony, Ifeoma Ajunwa, a Fellow at the Berkman Klein Center at Harvard University and an Assistant Professor at the University of the District of Columbia School of Law, said that there's legitimate concern about the way data about employees gets used in employment-related decisions.
"As so much of the activity behind internet-scraping and machine-learning is not known – or knowable – to the people designing the programs, it is not always foreseeable that a certain algorithm will access and deploy prohibited information in its decision-making processes," Ajunwa said.
In a phone interview with The Register, Frank Pasquale, professor of law at the University of Maryland, observed that if the data used to train machine learning systems has problems, those issues may be reflected in the decisions the system makes.
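Pasquale's point can be made concrete with a toy sketch — the data and the "model" here are entirely hypothetical. A scorer fitted to nothing but historical hiring outcomes simply replays whatever pattern those outcomes encoded, fair or not:

```python
from collections import defaultdict

# Hypothetical historical hiring records: past decisions happened to
# favor one school over another, for whatever reason.
past_hires = [
    {"school": "Alpha U", "hired": True},
    {"school": "Alpha U", "hired": True},
    {"school": "Beta College", "hired": False},
    {"school": "Beta College", "hired": False},
]

def hire_rate_by_school(records):
    """Per-school hire rate observed in the training data."""
    hired, total = defaultdict(int), defaultdict(int)
    for r in records:
        total[r["school"]] += 1
        hired[r["school"]] += int(r["hired"])
    return {s: hired[s] / total[s] for s in total}

def score(candidate, records):
    """A 'model' trained only on past outcomes: a Beta College candidate
    scores zero regardless of qualifications, because past data said so."""
    return hire_rate_by_school(records).get(candidate["school"], 0.5)

print(score({"school": "Beta College"}, past_hires))  # 0.0
```

Real systems are far more elaborate, but the mechanism is the same: if the training data embeds a discriminatory pattern, a model optimized to reproduce that data will carry it forward — and, as Pasquale notes, without transparency no one outside can check.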
"It should be a requirement that any person applying [for a job] should be able to know what data is going in and what algorithms are parsing it," said Pasquale. "Otherwise it becomes the perfect black box to hide all sorts of discriminatory behavior."
Christa Manning, VP and HR solution provider research leader at Bersin, part of Deloitte Consulting, in a phone interview with The Register, said that applicant tracking systems, while still relevant, have become less so in recruiting. "Talent acquisition has evolved beyond that," she said, noting that more attention is going into the process before people apply, what she calls "candidate relationship management."
For large companies, Manning said, there are often too many applicants to deal with. "You really only want a few applicants to a job, and you want the best ones," she said. "It's a totally different world than when applicant tracking systems were developed."
With unemployment around five per cent in the US, most of the people companies really want have jobs, said Manning. So large firms often try to entice prospects working at other companies to consider new opportunities through social media and other forms of outreach. To nurture relationships with desirable job applicants, companies are employing the same techniques they use to win customers, she said.
Manning argues that the application of AI technology to the hiring process can help remove unconscious bias. As to helping people understand why they didn't get a job, she's not convinced algorithmic transparency matters. "I think forever people have wondered why I did not get that job, and unless you were able to get feedback from the hiring manager, it's not something the technology necessarily plays a role in," she said.
Hu likewise isn't convinced algorithmic transparency would be that beneficial, noting that algorithmic discrimination is something HR should be looking for. Even if you know how the code works, he said, "recruiters could make unfair queries." He added, "Once candidates know how [a hiring algorithm] works, they're going to try to game it."
That raises a question: If transparency invites abuse from potential employees, does opacity invite abuse from employers? ®