
[–]WickedWitchOfTheWest 1 insightful - 1 fun (0 children)

In NYC, companies will have to prove their AI hiring software isn't sexist or racist: AI-infused hiring programs have drawn scrutiny, most notably over whether they end up exhibiting biases based on the data they’re trained on.

A new law, which takes effect Wednesday, is believed to be the first of its kind in the world. Under New York’s new rule, hiring software that relies on machine learning or artificial intelligence to help employers choose preferred candidates or weed out bad ones (called an automated employment decision tool, or AEDT) must pass an audit by a third-party company to show it’s free of racist or sexist bias.

Companies that run AI hiring software must also publish those results. Businesses that use third-party AEDT software can no longer legally use such programs if they haven’t been audited.
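The article doesn’t spell out what such an audit measures, but the audits filed under the law are generally framed around selection rates and impact ratios: the share of candidates a tool advances from each race/ethnicity and sex category, compared against the most-favored category. A minimal sketch of that arithmetic, using made-up screening data and a hypothetical `impact_ratios` helper, might look like this:

```python
from collections import defaultdict

def impact_ratios(candidates):
    """Compute selection rates and impact ratios per demographic category.

    `candidates` is a list of (category, selected) pairs, where `selected`
    is True if the tool advanced the candidate. Each category's impact
    ratio is its selection rate divided by the highest selection rate
    observed across categories.
    """
    totals = defaultdict(int)
    advanced = defaultdict(int)
    for category, was_selected in candidates:
        totals[category] += 1
        if was_selected:
            advanced[category] += 1

    rates = {c: advanced[c] / totals[c] for c in totals}
    best = max(rates.values())
    return {c: (rates[c], rates[c] / best) for c in rates}

# Hypothetical screening outcomes, for illustration only.
sample = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
       + [("group_b", True)] * 25 + [("group_b", False)] * 75

for category, (rate, ratio) in impact_ratios(sample).items():
    print(f"{category}: selection rate {rate:.2f}, impact ratio {ratio:.2f}")
```

In that toy data, group_b’s impact ratio comes out around 0.62, the sort of gap an auditor would presumably flag for a closer look; a real audit under the law is more detailed than this sketch.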

Companies are increasingly using automated tools in their hiring processes. Cathy O’Neil, CEO of ORCAA, a consulting firm that has been auditing hiring tools for companies that want to be in good standing with New York’s new law, said tools that automatically judge job candidates have become necessary in part because job seekers are using their own tools to send out huge numbers of applications.

[...]

Jake Metcalf, a researcher specializing in AI at Data & Society, a nonprofit group that studies the effects of technology on society, said the wording of the law, which defines an AEDT as technology that will “substantially assist or replace discretionary decision making,” has led lawyers who advise large companies not to take it seriously.

“There are quite a few employment law firms in New York that are advising their clients that they don’t have to comply, given the letter of the law, even though the spirit of the law would seem to apply to them,” Metcalf said.