Talent acquisition can be stressful for HR managers…especially when a position needs to be filled quickly. Automated and AI employment decision tools such as resume scanners, employee monitoring software, virtual assistants and job-fit algorithms can certainly speed up the candidate-assessment process…but at what cost?

Lawmakers and regulators at the federal, state and local levels are increasingly scrutinizing whether AI hiring tools unintentionally perpetuate bias. Yes, these tools streamline the hiring process, but they may pose a compliance risk if they discriminate against certain candidates.

States Are Taking Action on the Use of AI in Employment Decisions…

In April 2023, New York City officially banned employers from using AI in employment decisions unless they take affirmative steps to mitigate potential bias.

Those steps include conducting a bias audit of the AI tool, making the results of that audit publicly available and notifying job candidates about the tool's use. Candidates and employees must also be allowed to request an alternative evaluation process if desired.

California is looking to expand its existing discrimination and employment laws to address liability risks for employers that leverage automated hiring tools, and Illinois and Maryland have already enacted laws regulating these tools.

And in Colorado, Vermont and Washington, task forces have been created to study AI and the risks it poses to an organization's DEI (diversity, equity and inclusion) efforts.

…And So Is the U.S. Government

On the federal level, the Equal Employment Opportunity Commission (EEOC) has issued warnings that using AI hiring tools in employment decisions could violate disability protections under the Americans with Disabilities Act, while the Federal Trade Commission has published reports expressing concerns about the use of automation in hiring without human review.

Meanwhile, the Department of Commerce has appointed 27 members to the National Artificial Intelligence Advisory Committee to advise on AI policy. And the White House has released its Blueprint for an AI Bill of Rights, which includes provisions related to employment decisions made with AI tools.

What should you be doing?

When it comes to using AI in hiring, there are a few things to keep in mind. The definition of AI can be broad and unclear, so it's important to understand what constitutes AI use in an automated hiring tool; your company's legal counsel can help with this assessment.

When using automated hiring decision tools, consider your data retention obligations and work with third-party vendors to ensure compliance with any new laws or regulations.

And of course, be prepared to address any implementation issues that may arise during the hiring process. Providing transparency and accountability in your decision-making will go a long way in fostering trust.

Need more advice on hiring practices, compliance and other HR-related issues…and want to speak to a human? Contact Accu Data today. We specialize in personal relationships and are here to answer any questions you have regarding the ever-evolving world of Human Capital Management (HCM).
