As with private companies and nonprofit organizations, government agencies large and small are adopting automation in Human Resources (HR) decision-making. For example, federal and local government agencies use algorithms to handle leave requests, issue certifications, and run background investigations. Algorithms, however, do not eliminate discrimination and other inequities in hiring; they can instead exacerbate or mask them.
HR Avatar is one example of a system that government agencies use to automate HR decisions. The system applies AI-driven voice and personality analysis to tests and interviews, producing quantitative evaluations of applicants for over 200 different positions, which agencies use to compare, screen, and select candidates. Federal and local agencies use HR Avatar for these pre-employment assessments; the Department of Homeland Security, Transportation Security Administration, Federal Aviation Administration, and Department of Commerce are listed as clients on HR Avatar’s website.
Recently, the Pottawattamie County Sheriff’s Office in Iowa posted a call for applicants for a detention officer position and explained that this system would be used in the hiring process. Applicants first complete HR Avatar’s Correctional Officer Pre-Employment Assessment and Virtual Interview, which gives the sheriff’s office an overall score and summary for each candidate, competency scores in areas important to the position, and other evaluations. The office then uses these evaluations to select which applicants to invite to continue in the hiring process. In effect, HR Avatar’s standardized, automated evaluation of soft skills helps the county make hiring decisions about a position that demands fluency in social skills, such as the ability to maintain composure when dealing with violent or hostile individuals.
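HR Avatar does not publish the internals of its scoring model, so the following is only a minimal sketch of how a standardized, weighted-competency screen of this general kind could work; every competency name, weight, and cutoff below is invented for illustration.

```python
# Hypothetical illustration only: HR Avatar does not disclose its scoring model.
# All competency names, weights, and the cutoff below are invented.

COMPETENCY_WEIGHTS = {
    "composure_under_stress": 0.30,
    "communication": 0.25,
    "rule_following": 0.25,
    "teamwork": 0.20,
}

SCREENING_CUTOFF = 65.0  # invented threshold on a 0-100 scale


def overall_score(competency_scores: dict[str, float]) -> float:
    """Combine per-competency scores (0-100) into one weighted overall score."""
    return sum(
        COMPETENCY_WEIGHTS[name] * competency_scores[name]
        for name in COMPETENCY_WEIGHTS
    )


def passes_screen(competency_scores: dict[str, float]) -> bool:
    """Return True if the applicant clears the (invented) cutoff."""
    return overall_score(competency_scores) >= SCREENING_CUTOFF


applicant = {
    "composure_under_stress": 72,
    "communication": 60,
    "rule_following": 80,
    "teamwork": 55,
}
print(overall_score(applicant), passes_screen(applicant))  # 67.6 True
```

The point of the sketch is structural: once soft skills are reduced to weighted numbers and a cutoff, whatever bias is baked into those numbers passes silently into the hiring decision.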
Although automation can improve the efficiency and success of hiring processes, and artificial intelligence is increasingly enticing as companies continue to shrink and outsource HR departments, researchers have highlighted the challenges of using data science for HR tasks. Automated processes have the potential to reflect existing biases, proactively shape applicant pools, and otherwise perpetuate discrimination in hiring. While many employment laws address discrimination in hiring practices, identifying and mitigating discrimination in employment screening algorithms raises new policy concerns (one concrete starting point is sketched below).

To further investigate government use of HR Avatar, journalists can file FOIA requests with agencies that use the system. Because HR Avatar is a private company, probing its assessment system directly may be difficult, but journalists can still review public documents, such as agencies’ contracts with the company. To investigate government use of HR algorithms more generally, journalists can research state and federal laws (and proposed legislation) governing automated employment practices. Furthermore, other leads in the Algorithm Tips database point to automation in HR decision-making processes, including applicant selection at a city fire department, the Department of Justice, and the U.S. Armed Forces.
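As a concrete starting point for the analysis flagged above, journalists who obtain screening outcomes (for instance, via FOIA) can apply the EEOC’s “four-fifths rule”: if one group’s selection rate is less than 80 percent of the highest group’s rate, the screen may show adverse impact. A minimal sketch, with invented counts standing in for real records:

```python
# Minimal adverse-impact ("four-fifths rule") check.
# The counts below are invented; real figures would come from agency records.

# (applicants screened, applicants who passed the screen) per group
outcomes = {
    "group_a": (200, 120),
    "group_b": (150, 60),
}

# Selection rate = passed / screened, per group.
rates = {g: passed / screened for g, (screened, passed) in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest
    flag = "potential adverse impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, "
          f"impact ratio {impact_ratio:.2f} ({flag})")
```

The four-fifths rule is a rough regulatory guideline, not proof of discrimination, but a failing ratio is a strong signal that a screening algorithm deserves closer scrutiny.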