Executive Order Tries to Thwart “Onerous” AI State Regulation, Calls for National Framework
What You Need to Know
Key takeaway #1
The White House has issued a much-anticipated Executive Order that seeks to restrain state AI regulation by threatening states with lawsuits and the withholding of funds, and calls for a national policy framework on AI.
Key takeaway #2
The EO faces significant implementation challenges and legal hurdles that will likely further complicate an already elaborate regulatory environment, which includes the potential impact of international AI regulations that are themselves in flux in light of recent EU proposals.
Key takeaway #3
Impacted companies should closely monitor agency initiatives to implement this EO, potential pushback by the states, and further efforts by the Administration to draft and advocate for what it terms a “minimally burdensome national policy framework for AI.”
Client Alert | 7 min read | 12.17.25
On December 11, 2025, President Trump signed a much-anticipated Executive Order that seeks to forestall state regulation of artificial intelligence (AI) by threatening federal lawsuits and the withholding of some federal funds, and calls for a national policy framework on AI. The Executive Order, Ensuring a National Policy Framework for Artificial Intelligence (EO), declares it the policy of the administration “to sustain and enhance the United States’ global AI dominance through a minimally burdensome national policy framework for AI.”
The president signed the EO after Congress chose in recent weeks not to include a provision to preempt or otherwise attempt to block state-based AI regulation in the National Defense Authorization Act for Fiscal Year 2026 (NDAA). A draft of the EO, about which we wrote previously, circulated publicly on November 19 and was similar in most respects to the final version.
The EO does not have the force of law, nor is it self-executing. Rather, it requires further action by federal agencies to implement it and will likely trigger political and legal challenges from impacted states. Furthermore, state laws will likely remain in effect even if the federal government brings suits against them.
The surest route to a national policy framework is through federal legislation, which this EO endorses but does not create. Absent that legislation, the contested state and federal landscape will continue to impose significant shifting regulatory burdens on companies developing, integrating, and deploying AI systems.
Directing Agencies to Act Against State AI Laws
Specifically, the EO:
- Directs the U.S. Department of Justice within 30 days to establish an AI Litigation Task Force (Task Force) to sue states for enacting laws that, in the Administration’s view, unconstitutionally regulate interstate commerce or are preempted by existing federal regulations.
- Requires the Secretary of Commerce within 90 days to publish an evaluation of state AI laws that are “onerous” or conflict with a “minimally burdensome” national policy, as well as laws that should be referred to the Task Force for potential action. The evaluation should, at a minimum, identify laws that may require disclosures or reporting contrary to the First Amendment of the U.S. Constitution. This evaluation may also identify state laws that “promote AI innovation.”
- Mandates the Secretary of Commerce within 90 days to issue a Policy Notice specifying when states may be eligible for Broadband Equity, Access, and Deployment (BEAD) Program funding, noting that states with “onerous” AI laws will be ineligible for such funds “to the maximum extent allowed by Federal law.”
- Directs executive departments and agencies to assess discretionary grant programs to determine whether agencies “may condition” such state grants on whether the states enact or enforce “onerous” AI laws.
- Directs the Federal Communications Commission within 90 days to begin a process to determine whether to adopt a federal reporting standard for AI models that would purport to preempt state laws.
- Directs the Federal Trade Commission within 90 days to issue a policy statement on its power to prosecute unfair or deceptive trade practices to challenge state laws that “require alterations to the truthful outputs of AI models.”
- Instructs White House officials to draft a legislative recommendation for a uniform “minimally burdensome” federal regulatory framework for AI that would preempt conflicting state laws. The legislative recommendation should not propose preempting state laws relating to child safety protections, AI computer and data center infrastructure, state government procurement and use of AI, and “other topics” to be determined.
The EO particularly criticizes Colorado’s AI Act for allegedly banning “algorithmic discrimination” in a way that may compel AI models to produce “false results.” (The draft EO also faulted California’s recent AI transparency law for its allegedly burdensome reporting requirements; the signed EO is silent on that law.)
Curtailing State Lawmaking
The EO attempts by executive action what Congress has failed to do through legislation — preempt or place a “moratorium” on state AI laws, which have proliferated in the past three years. State lawmakers have introduced hundreds of bills and adopted scores of them in state capitals across the country to protect consumers and children, limit AI use in certain circumstances, and impose transparency and reporting requirements on some AI developers.
In July 2025, the White House released America’s AI Action Plan, an extensive policy roadmap that exhorted the federal government to seek AI “dominance” by minimizing most regulations. The plan and related presidential orders led to a White House-issued request for information on federal, but not state, AI laws and policies that “unnecessarily hinder” AI development. The Commerce Department also requested input on a government-run American AI export program. And the Office of Management and Budget (OMB) recently released guidelines imposing certain transparency obligations on contractors to bar the federal government from procuring AI systems that incorporate “ideological biases or social agendas.”
Attempts to include a preemption provision or a quasi-moratorium in the NDAA sputtered in the face of Democratic and some Republican opposition. Earlier legislative efforts to impede state AI regulation likewise drew opposition from many Republicans in Congress who support at least some state-based AI regulation.
Impact
While this EO directs specific actions by federal agencies, its impact remains to be seen. For example, nowhere does the EO define what an “onerous” regulation is, how to determine which laws are “minimally burdensome,” or how agencies can determine what constitutes “truthful outputs of AI.” Moreover, attempts to enforce these provisions — by bringing suits alleging unconstitutional regulation of interstate commerce, withholding broadband funding or other grants, investigating state laws as unfair and deceptive, or preempting state laws based on an agency policy — will likely draw vigorous defenses or lawsuits from state attorneys general and impacted parties. Prominent Democrats and Republicans alike have criticized the EO as too closely aligned with industry and contrary to federalist principles, suggesting it will face stiff opposition.
Finally, federal preemption by executive decree, absent a clear congressional delegation of powers, is not a generally accepted practice under the U.S. Constitution. Courts are usually “even more reluctant” to find state laws preempted based on mere regulations as opposed to statutes, and the U.S. Supreme Court has held recently that the anti-commandeering principles of the Tenth Amendment bar the federal government from prohibiting certain state laws regulating private conduct.
Even with this EO now signed, state laws will likely remain in effect while the federal agencies proceed to implement the EO and even during the pendency of any federal government suits against states.
Indeed, David Sacks, the White House Special Advisor for AI and Crypto and a supporter of the President’s push, wrote that the EO is not a national framework itself, “or an amnesty or moratorium, but rather a statement of principles and a set of tools” for the White House to resist state laws it finds “onerous and excessive.”
Thus, the evolving, complex, and cross-cutting federal and state regulatory currents will continue, and companies that develop, integrate, and deploy AI systems should remain attuned to their shifting obligations.
International Complications
The EO will have an impact not just in the United States but also internationally, given its divergence from many laws outside the United States that directly affect companies developing AI and looking to deploy it. For example, in many cases, companies need to consider the EU AI Act, Data Act, and GDPR. Those laws, too, are facing upheaval; in November, the European Commission unveiled potential changes to its digital laws to streamline rules on AI, cybersecurity, and data. Thus, the national and international compliance picture is becoming more, not less, complex.
***
Crowell & Moring will continue to monitor these fast-moving AI policy and legislative developments at the state, federal, and international levels. For further information, please contact our team.