President Donald Trump has signed a sweeping new executive order intended to create a unified national approach to artificial intelligence regulation. But instead of resolving the growing tension between state and federal rules, legal experts warn the directive could trigger prolonged uncertainty, especially for startups that lack the resources to navigate conflicting laws.
The order, titled “Ensuring a National Policy Framework for Artificial Intelligence,” tasks federal agencies with challenging state-level AI regulations that the administration views as overly burdensome. Within 30 days, the Department of Justice must establish a task force to argue that AI constitutes interstate commerce and should therefore fall under federal control. Meanwhile, the Commerce Department has 90 days to identify “onerous” state AI laws that could influence a state’s eligibility for federal funding, including broadband grants.
The order also directs the FTC and FCC to evaluate potential federal standards that could override state rules, while urging Congress to craft a single nationwide framework, something lawmakers have struggled to agree on.
A Push for Consistency or a Path to Greater Confusion?
The order arrives amid a fierce national debate about how AI should be regulated. Supporters argue that state-by-state rules create costly compliance burdens, while critics warn that removing state authority without a federal law in place could leave consumers vulnerable.
Michael Kleinman of the Future of Life Institute sharply criticized the policy, calling it “a gift for Silicon Valley oligarchs” and accusing AI czar David Sacks of shielding large tech companies from accountability. Even advocates of national regulation admit the order does not resolve the underlying issue: states can still enforce their laws unless courts intervene.
Legal experts like Sean Fitzpatrick of LexisNexis predict state attorneys general will defend their authority, setting up court battles that could ultimately reach the Supreme Court. This means startups may face years of uncertainty while waiting for definitive guidance.
Startups Fear Costly Delays and Compliance Challenges
Founders and AI governance professionals warn the order may unintentionally harm the very companies it claims to help. Smaller startups often lack dedicated compliance teams and rely on clear, stable rules to plan product development.
“Startups don’t have robust regulatory governance programs until they reach scale,” said Hart Brown, who helped shape Oklahoma’s AI policy. Arul Nigam of Circuit Breaker Labs added that companies are unsure whether to self-regulate, follow open-source standards, or pause development altogether.
A Risk of Slowed Adoption and Reduced Trust
Experts like Trustible co-founder Andrew Gamino-Cheong argue that regulatory ambiguity will make it harder to sell AI tools to sectors like finance, healthcare, and law, industries that demand clear legal guardrails. Without clarity, he warns, "even the perception that AI is unregulated will reduce trust."
A Call for Congressional Action
Business groups and policy leaders, including Morgan Reed of The App Association, stress that only Congress can deliver a durable, risk-based national AI framework. Until then, the U.S. may remain caught between a patchwork of state laws and a potentially lengthy battle over the constitutionality of Trump’s executive action.