NYC Considers Regulation of Artificial Intelligence in Hiring
Client Alert | 3 min read | 03.13.20
Artificial intelligence and predictive analytics tools are, without question, attracting rapidly growing interest among sophisticated employers seeking to streamline hiring processes and match the best-qualified candidates with open roles. Now, the New York City Council has introduced legislation to specifically regulate the use of artificial intelligence in hiring. This proposed legislation comes on the heels of last summer’s SB S3971B, which New York’s Governor Andrew Cuomo signed in July 2019 to authorize the creation of a state commission to study and investigate how to regulate artificial intelligence, robotics and automation. If passed, the new city law would take effect on January 1, 2022. This proposed legislation may well serve as a model on which other cities and states build their own legislation in this arena.
The new law would require that any developer of an “automated decision tool” who wishes to sell that product in New York City:
- Be able to show that the tool was the subject of a “bias audit” conducted in the past year.
- Offer, at no additional cost, an annual bias audit and provide the results of the audit to the purchaser.
- Include a notice (aimed at the purchaser) stating that the tool is subject to the provisions of this law.
The law defines “automated decision tool” as “[a]ny system whose function is governed by statistical theory, or systems whose parameters are defined by such systems, including inferential methodologies, linear regression, neural networks, decision trees, random forests, and other learning algorithms, which automatically filters candidates or prospective candidates for hire or for any term, condition or privilege of employment in a way that establishes a preferred candidate or candidates.” This would include Pymetrics and similar products that aim to promote efficiency and reduce human-driven bias in hiring.
Bias Audit
As noted above, the law would require a “bias audit,” which is defined as “an impartial evaluation” of the tool “to assess its predicted compliance with the provisions of” the anti-discrimination provisions of the City’s Code. The City’s anti-discrimination provision prohibits employment practices that have a disparate impact based on any of its protected categories: “age, race, creed, color, national origin, gender, disability, marital status, partnership status, caregiver status, sexual and reproductive health decisions, sexual orientation, uniformed service or alienage or citizenship status.” The NYC law is silent as to how lawmakers would expect developers, or prospective users of these products, to perform the requisite audit as to disparate impact on this broad list of protected categories. Notably, employers other than federal government contractors and some state and municipal government contractors are currently under no legal obligation to collect applicant demographic data.
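While the bill does not prescribe an audit methodology, one common benchmark for disparate impact, drawn from the Uniform Guidelines on Employee Selection Procedures discussed in the Takeaways below, is the “four-fifths rule”: a group’s selection rate below 80% of the highest group’s rate is generally regarded as evidence of adverse impact. The following is a minimal, illustrative sketch of that calculation; the function names and sample data are hypothetical and are not drawn from the bill.

```python
from collections import Counter

def selection_rates(outcomes):
    """Compute per-group selection rates from (group, selected) pairs."""
    applied = Counter()
    selected = Counter()
    for group, was_selected in outcomes:
        applied[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / applied[g] for g in applied}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below 80% of the highest
    group's rate, the EEOC's four-fifths rule of thumb for adverse impact."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: (rate, rate / top >= threshold) for g, rate in rates.items()}

# Hypothetical screening outcomes: (group, passed_screen)
outcomes = ([("A", True)] * 48 + [("A", False)] * 52
            + [("B", True)] * 30 + [("B", False)] * 70)
print(four_fifths_check(outcomes))
# Group B's rate (0.30) is 62.5% of Group A's (0.48), so B is flagged
```

Even under this simple benchmark, the calculation requires applicant demographic data for every protected category, which, as noted above, most employers are under no legal obligation to collect.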
Notice to Candidates
Although the law would place requirements primarily on developers of automated decision tools, it also would require any employer that uses such a tool “to screen a candidate” for employment to provide a notice to candidates disclosing:
- That an automated employment decision tool required by the law to be audited for bias was used in connection with the candidate’s candidacy.
- The job qualifications or characteristics that the tool was used to assess in the candidate.
Employers would have to provide this notice within 30 days of the tool’s use. The law does not define “qualifications or characteristics,” leaving open what must be described in the notice. Vendors that have developed many of the predictive analytics tools available today do not, in the ordinary course, disclose information regarding their algorithms to prospective buyers. As a result, identifying the “job qualifications or characteristics” assessed by the tool may present challenges.
Takeaways for Employers
Regardless of whether this proposed legislation becomes law, the New York City Council’s efforts to regulate this space illustrate the various challenges of applying existing legal frameworks to new and evolving technologies. For more than four decades, employers (and their counsel) have relied on the Uniform Guidelines on Employee Selection Procedures, adopted by the Equal Employment Opportunity Commission, the Civil Service Commission, the Department of Labor, and the Department of Justice, to inform businesses’ use of tests and other selection procedures in hiring. As new tools and technologies proliferate, we expect to see more attempts by state and local governments to enact laws regulating, and perhaps limiting, the use of such tools. Likewise, we expect sophisticated employers to have questions about the implications of relying on these tools, including potential indemnification by vendors against discrimination-in-hiring claims and other potential risks. Employers should watch this space closely for developments in the coming months.