President Donald Trump signed an executive order Thursday night directing federal agencies to challenge state AI laws, arguing that startups need relief from a “patchwork” of rules. But legal experts and startup founders say the order could prolong uncertainty, spark legal battles and leave young companies navigating changing state requirements while waiting to see if Congress can agree on a single national framework.
The order, titled “Securing a National Policy Framework for Artificial Intelligence,” directs the Department of Justice to establish a task force within 30 days to challenge certain state laws on the grounds that AI is interstate commerce and should be regulated by the federal government. It also gives the Commerce Department 90 days to compile a list of state AI laws that are “onerous,” an assessment that could affect each state’s eligibility for federal funds, including broadband subsidies.
The order also requires the Federal Trade Commission and Federal Communications Commission to consider federal standards that could preempt state rules, and directs the administration to work with Congress on a uniform AI law.
The order comes as efforts in Congress to suspend state regulations have stalled, even amid widespread calls to rein in state-by-state AI rules. Lawmakers from both parties have argued that blocking state action without federal standards in place could put consumers at risk and leave businesses largely unchecked.
“The David Sacks-led executive order is a gift to Silicon Valley oligarchs who are using their influence in Washington to protect themselves and their companies from accountability,” Michael Kleinman, director of U.S. policy at the Future of Life Institute, which focuses on mitigating extreme risks from transformative technologies, said in a statement.
As the Trump administration’s top AI and crypto policy official, Sacks has been a leading figure in the administration’s push to preempt state AI laws.
Even supporters of a national framework acknowledge that this order does not create one. Startups could face an extended transition period because state laws can still be enforced unless a court blocks them or states suspend enforcement.
Sean Fitzpatrick, CEO of LexisNexis North America, UK and Ireland, told TechCrunch that states will defend their consumer protection powers in court and that the case will likely escalate to the Supreme Court.
Supporters say the order reduces uncertainty by centralizing AI regulatory battles in Washington, while critics say legal battles will create immediate headwinds for startups navigating conflicting state and federal demands.
“Startups prioritize innovation, so they typically don’t have a strong regulatory governance program until the company reaches a scale that requires one,” Hart Brown, lead author of the recommendations for Oklahoma Governor Kevin Stitt’s Task Force on AI and Emerging Technologies, told TechCrunch. “These programs can be expensive and time-consuming to navigate in a highly dynamic regulatory environment.”
Arul Nigam, co-founder of Circuit Breaker Labs, a startup that red-teams conversational and mental health AI chatbots, expressed similar concerns.
“There is uncertainty as to whether [AI companion and chatbot companies] need to self-regulate,” Nigam told TechCrunch, noting that the patchwork of state AI laws is having a negative impact on smaller startups in the space. “Are there open source standards to follow? Should we keep building?”
He added that he hopes Congress can move more quickly and pass a stronger federal framework.
Andrew Gamino-Chong, co-founder and CTO of AI governance firm Trustible, told TechCrunch that the executive order’s pro-innovation goals could backfire. “Big Tech and big AI companies have the money to hire lawyers to figure out what to do, or they can simply absorb the risk,” he said. “Uncertainty hurts startups the most, especially those without billions in cash at their disposal.”
He added that legal ambiguity makes it difficult to sell to risk-sensitive customers such as legal teams, financial companies and medical institutions, lengthening sales cycles and driving up compliance work and insurance costs. “Even the perception that AI is unregulated will reduce trust in AI,” Gamino-Chong said, noting that trust in AI is already low, which threatens adoption.
Gary Kivell, a partner at Davis & Gilbert, said that while companies would welcome a single national standard, “an executive order is not necessarily an appropriate way to override formal state laws.” He warned that the current uncertainty leaves companies caught between two extremes, very restrictive rules or no action at all, both of which could create a “wild west” that favors Big Tech’s ability to absorb risk and wait for the dust to settle.
Meanwhile, Morgan Reed, president of the App Association, urged Congress to “immediately enact a comprehensive, targeted, risk-based national AI framework,” adding that “a patchwork of state AI laws is not an option, nor is a long legal battle over the constitutionality of an executive order any better.”
