Analyzing critical legal trends and developments across data, cyber, AI and digital regulations from around the world and beyond borders

House Republicans are pushing to include, as part of the Budget Reconciliation Bill (the “Bill”), a 10-year ban on AI regulation and enforcement by anyone other than the federal government. The Republican-led House Energy and Commerce Committee put forth a version of the Bill this week that includes a provision stating:

“No State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.”

If the Bill is passed as written, the provision would prohibit state and local governments from passing or enforcing any law that regulates AI models or systems for 10 years from the date of enactment. The provision defines “automated decision” systems as “any computational process derived from machine learning, statistical modelling, data analytics, or artificial intelligence that issues a simplified output, including a score, classification, or recommendation, to materially influence or replace human decision making.” Critics argue that this broad definition would allow the moratorium to extend far beyond AI to other technologies.

Including the provision in the Bill allows lawmakers to fast-track it, as the budget reconciliation process requires only a simple majority in the Senate rather than the 60 votes normally needed to overcome a filibuster. Even so, the provision may hit a procedural roadblock in the Senate, as reconciliation bills are supposed to focus solely on fiscal matters.

While the proposal appears to exempt technology-neutral laws, such as civil rights or consumer protection laws that regulate harmful outcomes to individuals whether or not AI is involved, it is clear that if a state law imposes a “substantive design, performance, data-handling, documentation, civil liability, taxation, fee, or other requirement” on covered AI systems, it would be subject to the 10-year enforcement moratorium unless the requirement (i) is imposed under federal law or (ii) is imposed by a generally applicable law that treats non-covered models and systems “that provide comparable functions” the same as covered AI models or systems.

The Bill appears to target laws like the Colorado AI Act, which focuses on high-risk AI systems. Substantive state requirements on AI transparency, bias audits, or risk assessments would likely also be blocked under the moratorium.

The Bill includes narrow carveouts allowing enforcement only of laws that facilitate AI adoption, such as those streamlining licensing or permitting. The Bill’s impact on comprehensive state privacy laws, which include elements that also touch on AI, is unclear.

In the absence of a federal law regulating AI, states have adopted, and continue to consider, their own ways to regulate AI. In 2024 alone, more than 30 states passed laws or resolutions relating to AI. These laws include restrictions on the use of deepfakes in elections, discriminatory uses of AI in hiring, and the creation of digital replicas without consent.

This proposal comes on the heels of the firing of Shira Perlmutter, Register of Copyrights and Director of the U.S. Copyright Office, days after the U.S. Copyright Office released the third report in its Copyright and Artificial Intelligence series, entitled Generative AI Training, as a pre-publication report. The report, which is put forth as guidance by the Copyright Office and is not legally binding, found that some forms of AI training might not qualify for the “fair use” exemption under existing copyright law. The first two reports released by the Copyright Office were “Copyright and Artificial Intelligence: Copyrightability” and “Copyright and Artificial Intelligence: Digital Replicas.”

Author

Cynthia J. Cole is Chair of Baker McKenzie’s Global Commercial, Tech and Transactions Business Unit, a member of the Firm’s global Commercial, Data, IP and Trade (CDIT) practice group steering committee, and Co-chair of Baker Women California. A former CEO and General Counsel, Cynthia was, just before joining the Firm, Deputy Department Chair of the Corporate Section in the California offices of Baker Botts, where she built the technology transactions and data privacy practice. An intellectual property transactions attorney, Cynthia also has expertise in AI, digital transformation, data privacy, and cybersecurity strategy.

Author

Rachel Ehlers is a partner in Baker McKenzie's Intellectual Property and Technology Practice Group, based in the Firm's Houston office. Rachel's practice focuses on technology transactions, data privacy and cybersecurity. She has extensive experience advising clients on data incidents and breach response, cross-border transfers, and data privacy and cybersecurity issues related to mergers and acquisitions.