Natural Intelligence: NIST Releases Draft Guidelines for Government Contractor Artificial Intelligence Disclosures
Client Alert | 3 min read | 08.28.24
On August 21, 2024, the National Institute of Standards and Technology (NIST) released the Second Public Draft of its Digital Identity Guidelines (hereinafter, “Draft Guidelines”) for final review. The Draft Guidelines introduce potentially notable requirements for government contractors using artificial intelligence (AI) systems, the most significant of which concern disclosure and transparency for AI and machine learning (ML). With these requirements, NIST underscores its commitment to fostering secure, trustworthy, and transparent AI, while also addressing broader implications of bias and accountability. For government contractors, the Draft Guidelines are not just a set of recommendations but a blueprint for future AI standards and regulations.
In assessing digital identity risk management, NIST focuses on three main functions: identity proofing, authentication, and federation. A failure in any of these “can result in the wrong subject successfully accessing an online service, system, or data.” See Draft Guidelines, Section 3. The Draft Guidelines note that AI and ML are used in identity systems for multiple purposes (from biometrics to chatbots) and that potential applications are extensive, but that AI and ML also introduce distinct risks, such as disparate outcomes, biased outputs, and the exacerbation of existing inequities. See Draft Guidelines, Section 3.8.
As a result, Section 3.8 of the Draft Guidelines has been updated to require that, in any identity system:
- All uses of AI and ML must be documented and communicated to organizations relying on these systems; credential service providers (CSPs), identity providers (IdPs), and verifiers using AI and ML must disclose this use to all responsible persons making access decisions based on these systems.
- Organizations using AI and ML must provide information to entities using their technology, including methods and techniques for training models, descriptions of training data sets, frequency of model updates, and testing results.
- Organizations using AI and ML systems must implement the NIST AI Risk Management Framework to evaluate risks and must consult NIST Special Publication 1270 on managing bias in AI.
In other words, NIST’s Draft Guidelines call for detailed disclosures that explain how AI systems operate, the data they rely on, and the algorithms that drive their decisions. Clear disclosures will help government clients understand how AI systems work, which can improve decision-making in areas where AI decisions have significant consequences, such as healthcare, law enforcement, and public policy. At the same time, accountability and ethical considerations help foster trust in AI solutions.
As AI continues to revolutionize various industries, its integration into government projects brings opportunities and challenges. NIST’s role in developing and promoting standards that ensure security, privacy, transparency, and reliability with new technology will be crucial in shaping how AI systems are designed, implemented, and disclosed. Government contractors who embrace the Draft Guidelines may be better positioned to lead in this evolving landscape, shaping new requirements and delivering AI solutions aligned to the highest standards.
NIST is seeking public comments on the Draft Guidelines through October 7, 2024. Stakeholders should engage with NIST through public comments now and begin planning for adherence to these guidelines. Taking steps to weigh in on the Draft Guidelines, and to prepare for implementation should they take effect, will be essential for anticipating the final guidelines and ensuring compliance.
Reevaluating contract provisions and developing AI governance programs in line with the Draft Guidelines are crucial preparatory steps. Government contractors need to be positioned to comply seamlessly with requirements already placed on government agencies through President Biden’s Executive Order on AI and related OMB guidance, which will necessarily flow down to contractors.
Crowell & Moring LLP can help clients navigate this legal landscape: understanding the unique legal implications of the NIST guidelines, assessing legal risks associated with AI disclosures, and identifying areas where a client may be vulnerable to potential litigation. Crowell can also advise on where the Draft Guidelines intersect with existing statutes and regulations, such as the Federal Acquisition Regulation (FAR) and the False Claims Act (FCA), conduct trainings, and help develop strategies to mitigate risk from a comprehensive legal perspective.
As NIST begins to collect public comments on the Draft Guidelines, Crowell will continue to monitor legal and policy developments regulating the use of artificial intelligence. We are prepared to help clients submit comments and engage with regulators, as well as consider their potential next steps.