Is your company’s artificial intelligence showing?

Sunira Chaudhri

Toronto Employment Lawyer

On July 5, 2023, New York City will begin to enforce Local Law 144, a law that regulates employers' use of artificial intelligence tools in the hiring and promotion of employees.

If New York employers are using artificial intelligence to sift through resumes and make promotion decisions, they are about to become the target of a new law.


It would likely surprise most readers to learn that large employers are already using automated employment decision tools (AEDTs), powered by artificial intelligence, to synthesize employee data. Employers rely on these tools to help make hiring decisions for external candidates and to identify candidates for promotion, saving money on HR staff and recruiters.

By July 5, 2023, New York employers will have to disclose to the city whether they actually use AEDTs in hiring and promotion. If they do, the new law requires these tech-savvy employers to hire an independent third party to conduct a bias audit, and the results of any bias audit must then be published. Employers who use AEDTs will even have to notify applicants and employees that AEDTs are used in hiring and promotion decision-making. Once notified, employees or candidates can request an accommodation or an alternative selection process.

With the introduction of Local Law 144, the true potential of artificial intelligence in the workplace may never be known. And, perhaps, for good reason. Some AEDTs contain biases that can screen out strong candidates for arbitrary reasons. New York City's strict regulation of these platforms raises ethical and legal questions about allowing technology to drive employment decisions.

It is no secret that I believe artificial intelligence has clear limitations in the workplace. While these high-powered tools may be able to synthesize data and save HR professionals scores of hours in reviewing applicant material, artificial intelligence platforms carry inherent, unseen biases that could favour some employees over others. AI, after all, is trained with human interaction and guidance, and its decision-making power can be skewed by the biases of its trainer.


For example, AEDTs may be biased in favour of younger candidates, men over women, applicants with "marketable" names, and candidates who do not require medical or other accommodations.

Most jurisdictions, including New York and Ontario, have robust human rights protections that are meant to remove discriminatory hurdles and bias from application processes. Using AEDTs may introduce discriminatory hiring practices into workplaces that employers would have little power to curtail.

For example, employers in Ontario are not permitted to ask applicants their age at an interview or to make hiring decisions based on potential accommodations an employee may need. If a worker is denied employment or a promotion on a discriminatory ground, the employer has acted illegally and the employee would be entitled to remedies under the Human Rights Code.

We have all read stories about unfair hiring practices at big corporations, and stories of women and people of colour being unfairly treated with respect to promotions. Little attention, however, has been given to the way artificial intelligence may disrupt or skew hiring and promotion processes if implemented to run a company's hiring initiatives.

As New York City regulates away the power of artificial intelligence in the employment space, jurisdictions like Ontario and British Columbia are sure to follow.

With large employers already mobilizing artificial intelligence to scan resumes and identify strong candidates, Canadian employers should be wary of the biases these programs may have built into them, which could expose employers to legal liability they are probably not even aware of.

Artificial intelligence tools are fast and powerful, but they are most certainly not impartial.

Have a workplace issue? Maybe I can help! Email me at sunira@worklylaw.com and your question may be featured in a future column.


The content of this article is general information only and is not legal advice.
