Does Your HR Department Use AI for Hiring?
Here's what you should know. When algorithms influence who gets hired and promoted, there are legal and ethical considerations to keep in mind.
BY JENNIFER CONRAD, SENIOR WRITER @JENNIFERCONRAD
Artificial intelligence is increasingly playing a role in who gets hired and promoted.
John Verdi, senior vice president for policy at the Future of Privacy Forum, a Washington, DC-based think tank, says applicant-screening software is constantly evolving, and that many common software systems, including those used by small businesses, incorporate AI features.
The human resources software maker Workday uses AI to help identify applicants and spot trends in in-demand skills. LinkedIn uses recommendation algorithms to help recruiters find similar candidates.
The new systems promise to save time and surface more qualified candidates, relying on anything from text-based systems that scan resumes for keywords to machine vision systems that purport to detect personality traits in video calls.
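The keyword-scanning approach mentioned above can be sketched in a few lines. This is an illustrative toy, not how any vendor's product works; the skill list, resume snippets, and candidate names are hypothetical.

```python
def keyword_score(resume_text, keywords):
    """Count how many of the role's required keywords appear in the resume."""
    text = resume_text.lower()
    return sum(1 for kw in keywords if kw.lower() in text)

# Hypothetical required skills and resume excerpts.
skills = ["Python", "SQL", "data analysis"]
resumes = {
    "candidate_a": "Experienced in Python and SQL reporting.",
    "candidate_b": "Background in marketing and sales.",
}

# Rank candidates by how many required skills their resume mentions.
ranked = sorted(resumes, key=lambda c: keyword_score(resumes[c], skills), reverse=True)
print(ranked)
```

Even this trivial version shows why such tools need oversight: a strong candidate who phrases a skill differently ("Postgres" instead of "SQL") scores zero for it.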
But there are legal and ethical constraints to keep in mind. The Federal Trade Commission has repeatedly warned U.S. business owners that they're responsible for complying with anti-discrimination laws when using automated systems. Last summer, New York City passed a law requiring businesses that use AI in hiring and promotion decisions to disclose the use of AI and conduct annual audits, and other jurisdictions are considering similar regulations.
Verdi says AI shouldn't fully replace humans in the hiring process, but there are steps companies should take to ensure they're using AI tools responsibly.
Have a plan in advance
Before you introduce a new system, think through what you hope it will accomplish, says Verdi. Do you want to sift through a large number of resumes faster? Or increase the diversity of your candidate pool? Knowing what you want can help you find the best system from a software vendor--and will allow you to measure the success of a system against those goals in the future.
When you speak to vendors, find out what sort of data protection policies are in place, including how passwords are protected, how information is encrypted in transit, and what personal information is stored. More than a dozen states, including Delaware, California, and Texas, have passed their own laws governing data collection or requiring that consumers have the option to opt out.
Don't use "secret AI"
This fall, the Future of Privacy Forum worked with ADP, Indeed, LinkedIn, and Workday to develop a set of best practices for using AI in human resources decisions. One of the key recommendations was that companies should not secretly use AI to evaluate job seekers or employees, including chatbots that evaluate interactions and AI tools that score writing samples.
The report recommends always disclosing to candidates when AI is being used to make decisions that have "consequential impacts," such as deciding who gets hired, fired, or promoted. New York City's law requires it. Some systems have automatic disclosures built in that employers can use or modify, Verdi says. You should also have an alternate review system in place for people who request to be evaluated by a human.
Avoid tools that claim to read emotions
Certain systems attempt to judge a candidate's emotions or capabilities using biometric data such as facial expressions or manner of speech. "A lot of organizations look at these skeptically, often for good reason," says Verdi. The systems may be imprecise or introduce biases into the hiring process. HireVue, for example, developed a facial analysis system for judging candidates on video calls--and later pulled it amid backlash and bad press.
Verdi adds that there are some situations in which computerized assessments might be helpful, such as judging the focus of air-traffic controllers. In cases like that, the tools should be used in narrow ways "directly tied to the underlying skills that are at issue for the role," he says.
Do your own auditing
Depending on where you're based, you may be required to conduct regular audits of your systems to ensure they're not producing biased results. Vendors are responsible for auditing their own systems, but Verdi suggests companies do internal audits as well, by looking at data on who gets hired or promoted before and after new systems are put in place.
"Start to look at the outcomes," says Verdi. "Is this tool helping us identify better candidates, more qualified candidates?"