The Trump Executive Order on AI Regulation: Key Takeaways for the Health Care Industry

On the evening of December 11, 2025, President Trump signed an Executive Order titled "Ensuring a National Policy Framework for Artificial Intelligence," designed to further his Administration's effort to preempt state regulation of artificial intelligence and to challenge state laws in the space through enforcement actions by a multiagency review task force operating under the Attorney General. The stated purpose of the Executive Order is to encourage and advance the use of artificial intelligence by fostering a legal and regulatory environment that promotes innovation, mitigating the impact of individual state efforts to regulate artificial intelligence, and avoiding a patchwork of state laws that creates burdensome compliance requirements for the industry. The Executive Order also aims to prevent laws or regulations addressing the influence of ideology on model design or model outputs.
The Executive Order was an anticipated step in the Administration's effort to regulate artificial intelligence, as prior attempts to establish federal preemption of state laws through legislation failed in recent months, even as a draft of this Executive Order circulated. The Administration was unsuccessful in its efforts to include preemption provisions in both H.R. 1 (the "One Big Beautiful Bill Act") and the recent National Defense Authorization Act ("NDAA") conference committee report. Both the legislative proposals and the draft Executive Order drew criticism from stakeholders across the political spectrum, as well as from state elected officials.
With no consensus on a national framework for artificial intelligence regulation, the states have taken the lead, creating the fragmented landscape that this Executive Order seeks to address. The Executive Order also marks a new approach in the Administration’s strategy, placing greater reliance upon the interstate commerce clause as a means for federal preemption of state law in this rapidly developing area.
KEY PROVISIONS OF THE EXECUTIVE ORDER
The Executive Order does not create new federal law or specific regulations that would supersede state efforts to prohibit or otherwise place guardrails on the development or deployment of tools that rely upon artificial intelligence. Instead, it directs various agencies, such as the Department of Justice, Federal Trade Commission, and Federal Communications Commission, to work within their existing authority to bring challenges against state efforts deemed to conflict with the intent and goals of the Executive Order and its broader goal of establishing a uniform national regulatory framework. Additionally, the Executive Order proposes withholding certain federal funds from states that have passed laws that conflict with the policies of the order.
AI Litigation Task Force
The Executive Order requires creation of an "AI Litigation Task Force" within the Department of Justice. The sole responsibility of this Task Force is to challenge state laws the Attorney General views as inconsistent with the Administration's perspectives on regulation of emerging technologies and other areas of policy. The Task Force is authorized to bring or support litigation relying on the Dormant Commerce Clause and statutory preemption by existing federal regulations, targeting what the Administration characterizes as the most onerous state measures.
Review of Existing State AI Laws
The Executive Order directs the Secretary of Commerce, in consultation with the President's Special Advisor for AI and Crypto and other senior White House officials, to publish within 90 days an evaluation of existing state laws that address the use of artificial intelligence, both generally and in specific contexts such as health care. That review must at minimum (1) identify "onerous" laws that require AI systems to alter "truthful outputs" or compel disclosures the Administration believes infringe the First Amendment, and (2) flag specific statutes to be referred to the AI Litigation Task Force for potential challenge, while also optionally identifying state laws deemed innovation-friendly.
Future Legislation
There have been Republican efforts this year to pass a moratorium on state AI laws in both the One Big Beautiful Bill Act and the National Defense Authorization Act (NDAA). Both efforts were ultimately scrapped before final passage, with some Republicans rejecting the provision. Preemption legislation would not clear the Senate's sixty-vote threshold, leaving supporters to focus on passage through budget reconciliation, which, although unlikely, would require only a simple majority.
The Executive Order also urges Congress to work with members of the Administration to draft legislation that preempts state law in furtherance of efforts to establish a uniform national approach to regulating artificial intelligence. Recently, Congress has been active in this area, as evidenced by recent hearings by the Senate Judiciary Committee and House Energy and Commerce Committee that explored mental health care use cases and efforts to ensure consumer safety. Still, there has been no significant movement of legislation in either chamber. Notably, the Executive Order’s call for legislative solutions explicitly exempts child safety, data center infrastructure, state government use of artificial intelligence, as well as “other topics as should be determined” by the Administration.
Restriction of Federal Funding
The Executive Order directs federal agencies to use the distribution of, or eligibility for, certain federal broadband and discretionary grant funds as a mechanism to ensure state compliance and prevent further state regulation. Within 90 days, the Secretary of Commerce must issue a Policy Notice outlining when states may receive remaining funds under the Broadband Equity, Access, and Deployment ("BEAD") Program. The notice must declare that states with onerous AI laws are ineligible for non-deployment BEAD funds to the extent permitted by law. It must also explain that inconsistent state AI regulations could disrupt broadband deployments and AI innovation dependent on high-speed networks. States have relied upon BEAD funding to support initiatives to expand broadband coverage, in part to further access to care through telemedicine.
Other agencies, such as those under the Department of Health and Human Services, are directed to review their discretionary grant programs and, in consultation with the Special Advisor for AI and Crypto, determine whether those grants could be conditioned on states' compliance with the Executive Order's policies.
KEY IMPACTS AND STRATEGIC CONSIDERATIONS
An executive order may preempt state law only when it implements provisions of federal law. There is no federal law addressing the regulation of artificial intelligence cited in the Executive Order. Accordingly, this Executive Order does not preempt state law despite its characterization.
Rather, it serves as a broad directive instructing federal agencies to take actions consistent with their existing statutory authority and deliver on the President’s request for evaluation of state laws within 90 days. The form and function of these directives remain to be seen, but there are several possible outcomes.
If the newly formed AI Litigation Taskforce initiates legal challenges to state laws, such efforts will likely encounter significant resistance. States are expected to argue that attempts to override existing or future legislation violate the Supremacy Clause or infringe upon state sovereignty and police powers under the Tenth Amendment. Notably, a bipartisan coalition of state attorneys general expressed concerns about federal preemption in November and signaled their intent to safeguard state authority over traditional areas of regulation—such as health care, consumer protection, and privacy—where AI use cases are emerging.
Last year, the states enacted more than 125 laws addressing the development and use of artificial intelligence. State laws that seek to regulate artificial intelligence broadly, through general provisions intended to address improper use of data or algorithmic bias, will face the greatest initial scrutiny. The Executive Order specifically identifies a Colorado law, which requires testing and disclosure of AI that makes consequential decisions about people's lives and seeks to prevent discrimination, as the type of law that the Secretary of Commerce and the newly created AI Litigation Task Force should review. California, which has been the most active state in passing legislation, may see challenges to laws that require transparency, prohibit discriminatory algorithmic pricing, or limit defenses for developers of artificial intelligence should use of their models create harm.
State laws such as Illinois' prohibition on mental health chatbots are less likely to be challenged, as such a challenge may be viewed unfavorably in popular opinion and would invite a prolonged legal battle over the expansion of the Dormant Commerce Clause into areas of traditional state regulation over matters of health care. This may result in state policymakers turning to laws and regulations that are narrowly tailored to specific health care use cases, such as the use and role of artificial intelligence within the scope of practice of licensed professionals.
The Executive Order may also impact the use of discretionary funding to advance the artificial intelligence strategy recently announced by the Department of Health and Human Services. Released on December 4, the strategy promotes an outcomes-first approach to integrating AI into care delivery and public health infrastructure to improve health at both individual and population levels. Among the programs most immediately affected may be grants under the Rural Health Initiative, expected to be awarded by year’s end. States that scale back efforts to regulate AI in an effort to avoid conflict with the Executive Order’s policy objectives may receive more favorable consideration. As noted above, restrictions on broadband funding can potentially impact efforts to expand telemedicine access. Leveraging discretionary funding to advance federal priorities is a well-established feature of cooperative federalism and would likely withstand legal challenge. However, the perceived impact on state sovereignty in matters of health care regulation could generate political and public scrutiny that ultimately limits the extent to which funding is used to reward or punish state activity.
The Executive Order is also unlikely to significantly affect litigation seeking damages from AI developers under product liability or tort theories. Nothing in the Executive Order prevents lawsuits—such as the case brought by the family of Matthew Raine against OpenAI regarding mental health applications of foundational LLMs—from proceeding in state courts. However, the outcomes of these cases, particularly if future litigation arises over federal preemption or if the Executive Order discourages additional state legislative activity, may ultimately shape the legal landscape more than the Order itself. For instance, the lawsuits may influence product design and acceptable use policies by prompting greater incorporation of health law principles related to data use and public safety into future AI tools, including those used in health care.
HLB's attorneys and government relations professionals will continue to monitor developments and provide timely updates on the Executive Order's implications for health care clients. In the meantime, regulators—including State Attorneys General—are expected to pursue oversight of hospitals and other facilities using artificial intelligence tools, ensuring they have robust governance practices that prevent improper or discriminatory use in the context of care, reimbursement, and insurance. Health care organizations and providers—especially those operating in jurisdictions with enacted or pending AI-related legislation—should closely track emerging requirements on privacy, disclosure, and vendor relationships now to ensure compliance with rapidly evolving legal standards and avoid enforcement risk. For more information, please contact Eric Fish, Alicia Macklin, Monica Massaro, Martha Cramer, or your regular Hooper, Lundy & Bookman contact.

