This time, New York City, not the state.
You see, NYC passed a law, New York City Local Law 144, that requires firms using algorithms in hiring and promotion decisions to have those algorithms audited for bigotry.
This law places an affirmative responsibility on the employer to certify that these bits of code do not discriminate:
After months of delays, New York City today began enforcing a law that requires employers using algorithms to recruit, hire or promote employees to submit those algorithms for an independent audit — and make the results public. The first of its kind in the country, the legislation — New York City Local Law 144 — also mandates that companies using these types of algorithms make disclosures to employees or job candidates.
At a minimum, the reports companies must make public have to list the algorithms they’re using as well as an “average score” candidates of different races, ethnicities and genders are likely to receive from the said algorithms — in the form of a score, classification or recommendation. It must also list the algorithms’ “impact ratios,” which the law defines as the average algorithm-given score of all people in a specific category (e.g. Black male candidates) divided by the average score of people in the highest-scoring category.
Companies found not to be in compliance will face penalties of $375 for a first violation, $1,350 for a second violation and $1,500 for a third and any subsequent violations. Each day a company uses an algorithm in noncompliance with the law, it’ll constitute a separate violation — as will failure to provide sufficient disclosure.
Importantly, the scope of Local Law 144, which was approved by the City Council and will be enforced by the NYC Department of Consumer and Worker Protection, extends beyond NYC-based workers. As long as a person’s performing or applying for a job in the city, they’re eligible for protections under the new law.
………
One needn’t look far for evidence of bias seeping into hiring algorithms. Amazon scrapped a recruiting engine in 2018 after it was found to discriminate against women candidates. And a 2019 academic study showed AI-enabled anti-Black bias in recruiting.
Elsewhere, algorithms have been found to assign job candidates different scores based on criteria like whether they wear glasses or a headscarf; penalize applicants for having a Black-sounding name, mentioning a women’s college, or submitting their résumé using certain file types; and disadvantage people who have a physical disability that limits their ability to interact with a keyboard.
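To make the arithmetic concrete, here is a minimal sketch of how an “impact ratio” as defined above could be computed: each group’s average algorithm-given score divided by the average of the highest-scoring group. The candidate data, category labels, and scores below are entirely hypothetical and purely illustrative; this is not the audit methodology the law prescribes.

```python
# Hypothetical sketch of the impact-ratio arithmetic described above.
# Data and category labels are invented for illustration only.
from collections import defaultdict
from statistics import mean

# Hypothetical (category, score) pairs produced by a screening algorithm.
candidates = [
    ("Black male", 0.62), ("Black male", 0.58),
    ("White female", 0.81), ("White female", 0.77),
    ("Hispanic female", 0.70), ("Hispanic female", 0.74),
]

# Average algorithm-given score per category.
scores_by_category = defaultdict(list)
for category, score in candidates:
    scores_by_category[category].append(score)
averages = {cat: mean(vals) for cat, vals in scores_by_category.items()}

# Impact ratio: each category's average divided by the highest category average.
highest = max(averages.values())
impact_ratios = {cat: avg / highest for cat, avg in averages.items()}

for cat, ratio in sorted(impact_ratios.items()):
    print(f"{cat}: average {averages[cat]:.2f}, impact ratio {ratio:.2f}")
```

An impact ratio well below 1.0 for a given group is exactly the kind of disparity the public reports are meant to surface.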
As I have noted on a number of occasions, one of the unspoken selling points of algorithms in employment, or rentals, or ads placed on the criminal enterprise formerly known as Facebook™ is that you can discriminate without any meaningful consequences.
For many employers, or landlords, this is seen as a feature, not a bug.