
NYC Companies Using AI Hiring Software Have to Prove the Process is Free of Racism or Sexism

Around 83% of employers, including 99% of Fortune 500 companies, use some form of automated tool as part of their hiring process.
PUBLISHED JUL 12, 2023
Cover Image Source: Pexels | Alex Knight

Companies in New York City that use artificial intelligence apps or software to hire new talent now have to prove that their process is free of sexism and racism. A new law, believed to be the first of its kind, requires that any automated employment decision tool (AEDT) these companies rely on pass an audit by an independent third party showing that the process does not discriminate on the basis of race or sex. Companies that use AI hiring software will also have to publish the results of those audits.

Image Source: ThisIsEngineering/Pexels

Companies have been investing more and more in AI, especially in their hiring processes. According to Equal Employment Opportunity Commission chair Charlotte Burrows, around 83% of employers, including 99% of Fortune 500 companies, use some form of automated tool as part of their hiring process, per an NPR report. The U.S. Equal Employment Opportunity Commission has also issued guidelines on the new technologies used to hire employees.

The commission found that resume scanners that prioritize keywords, chatbots that sort applicants against pre-written requirements, and software that analyzes an applicant's facial expressions may all perpetuate bias. A person with a speech impairment may be eliminated automatically before they can prove themselves in other areas, and a chatbot can reject people for having gaps in their resumes without ever giving them the chance to explain, as the hypothetical screener sketched below illustrates.
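To make that failure mode concrete, here is a minimal, hypothetical sketch of a rule-based screener of the kind described above. The keyword list, gap threshold, and candidate fields are illustrative assumptions, not details of any real vendor's tool.

```python
# Hypothetical sketch of a rule-based resume screener (illustrative only).
from datetime import date

REQUIRED_KEYWORDS = {"python", "sql"}   # assumed job requirements
MAX_EMPLOYMENT_GAP_MONTHS = 6           # assumed cutoff

def screen(resume_text: str, last_job_end: date, today: date) -> bool:
    """Return True if the candidate passes the automated screen."""
    text = resume_text.lower()
    has_keywords = all(kw in text for kw in REQUIRED_KEYWORDS)

    gap_months = (today.year - last_job_end.year) * 12 + (today.month - last_job_end.month)
    no_long_gap = gap_months <= MAX_EMPLOYMENT_GAP_MONTHS

    # A candidate with a long gap (e.g. caregiving or illness) is rejected
    # outright, with no chance to explain -- the failure mode the EEOC flagged.
    return has_keywords and no_long_gap

# A qualified candidate with a year-long employment gap is screened out.
print(screen("Python and SQL developer", last_job_end=date(2022, 6, 30), today=date(2023, 7, 12)))  # False
```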

Julia Stoyanovich, a professor at New York University, said, "First of all, I’m really glad the law is on the books, that there are rules now and we’re going to start enforcing them." Metcalf, a researcher at Data and Society who specializes in AI, said, "There are quite a few employment law firms in New York that are advising their clients that they don’t have to comply, given the letter of the law, even though the spirit of the law would seem to apply to them." He added that improving the overall system for people can help with this problem to quite an extent.

Local Law 144 bars employers from using an AEDT to screen candidates unless the tool has passed a bias audit. The law also requires the employer to send candidates a notice informing them that an AEDT has been used.

The law, enforced by the New York City Department of Consumer and Worker Protection, is meant to help eliminate bias from the recruiting process. The department's official website states that "at a minimum, an independent auditor’s evaluation must include calculations of selection or scoring rates and the impact ratio across sex categories, race/ethnicity categories, and intersectional categories."
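As an illustration of what those calculations involve, here is a minimal Python sketch of selection rates and impact ratios. The candidate counts are made-up numbers, not figures from any actual audit, and the impact ratio is computed as each category's selection rate divided by the rate of the most-selected category, which is how such ratios are commonly defined.

```python
# Minimal sketch of the selection-rate and impact-ratio math an auditor might run.
# The counts below are illustrative data, not audit results.

selected_by_group = {"Group A": 60, "Group B": 30}     # candidates the tool advanced
screened_by_group = {"Group A": 100, "Group B": 100}   # candidates the tool screened

# Selection rate = selected / screened, per category
selection_rates = {g: selected_by_group[g] / screened_by_group[g] for g in screened_by_group}

# Impact ratio = a category's selection rate divided by the highest selection rate
best_rate = max(selection_rates.values())
impact_ratios = {g: rate / best_rate for g, rate in selection_rates.items()}

for g in selection_rates:
    print(f"{g}: selection rate {selection_rates[g]:.2f}, impact ratio {impact_ratios[g]:.2f}")
# Group A: selection rate 0.60, impact ratio 1.00
# Group B: selection rate 0.30, impact ratio 0.50
```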

Any company found violating the new law faces a penalty of $500 for a first violation. Subsequent violations carry fines of up to $1,500 each.

Hiring processes can be time-consuming, and many companies are increasingly turning to AI to speed them up. However, determining which candidate is right for a role is not straightforward, and relying on AI alone has clear limitations.

Lack of human touch can be off-putting: Screening interviews conducted by software often lack the human touch, which triggers anxiety in many candidates and can lead to poorer outcomes.

AI can adopt human biases: Systems trained on past hiring decisions can learn the human biases embedded in that data and are therefore likely to repeat them (see the sketch after this list).

Missing instinct: AI can handle simple, repetitive screening tasks, but where gut instinct is required, it can fall short.

Doesn't consider soft skills: People are rarely hired on their resumes alone; personality and interpersonal skills often decide the outcome, and these are hard for software to assess.

Relies on humans: AI will always rely on human direction to carry out tasks.
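As a toy illustration of how bias creeps in, the sketch below "trains" on made-up historical decisions that favored one group and then scores new candidates. The data and the simple frequency-based model are assumptions for illustration only, not a depiction of any real hiring system.

```python
# Toy sketch of how a model trained on past hiring decisions inherits their bias.
from collections import defaultdict

history = [
    # (group, was_hired) -- skewed historical decisions (illustrative data)
    ("Group A", True), ("Group A", True), ("Group A", True), ("Group A", False),
    ("Group B", True), ("Group B", False), ("Group B", False), ("Group B", False),
]

# "Training": learn the historical hire rate per group
counts = defaultdict(lambda: [0, 0])   # group -> [hired, total]
for group, hired in history:
    counts[group][0] += int(hired)
    counts[group][1] += 1

def predict_hire_probability(group: str) -> float:
    hired, total = counts[group]
    return hired / total

# The learned scores simply replay the historical skew.
print(predict_hire_probability("Group A"))  # 0.75
print(predict_hire_probability("Group B"))  # 0.25
```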
