California Just Raised the Bar for AI Companies That Want Government Contracts

POLICY

Saad Amjad

4/5/2026 · 3 min read

If you sell AI to the government, California now wants receipts.

Governor Gavin Newsom signed a first-of-its-kind executive order last week that puts new pressure on AI vendors looking for state contracts. The idea is simple: if you want California's business, you need to prove your tech has real safeguards against misuse. That means showing how your company handles harmful bias, protects civil rights, and prevents the distribution of illegal content.

It sounds like standard compliance talk. But the details matter here, and so does the context.

What the Order Actually Does

The executive order gives the California Department of General Services and the California Department of Technology 120 days to build out new vendor certification processes. AI companies that want to work with the state will need to clearly explain their policies and internal safeguards before they can get a contract approved.

This isn't a suggestion. It's a requirement baked into procurement rules.

On top of that, the order tells state officials to develop best practices for watermarking AI-generated images and video, a move that directly targets deepfakes and synthetic media created or used by government agencies.

The order also opens the door for California to conduct its own supply-chain risk assessments, independent of federal decisions. That last point is more than a footnote. The Washington Post reported that this provision came after the Pentagon labeled Anthropic a supply-chain risk when the company refused to allow its AI to be used for mass surveillance. Under this order, California could look at that federal designation and simply disagree.

Why This Is a Big Deal

California is the fourth-largest economy on the planet. It's home to 33 of the world's top 50 privately held AI companies and captures roughly a quarter of all U.S. AI patents. Between Q3 2024 and Q2 2025, the Bay Area alone pulled in 51% of all U.S. AI startup funding tracked by Carta.

Walking away from California contracts just isn't realistic for most AI vendors.

That gives the state an unusual amount of leverage. When California sets a procurement standard, it doesn't just stay in California. Companies that build compliance features for one state contract tend to carry those features into their products everywhere. It's the same playbook California used with car emissions and data privacy. Write the rules for the biggest market, and the rest of the country follows.

The Federal Contrast

This order doesn't exist in a vacuum. It's a direct response to the Trump administration's approach to AI governance, which has leaned heavily toward deregulation. President Trump revoked Biden's 2023 executive order on safe AI development back in January 2025, removing mandatory red-teaming, safety reporting, and structured oversight for high-risk models. In December 2025, Trump directed the Justice Department to challenge state AI laws, but explicitly exempted state government procurement from that push.

That exemption is probably why Newsom chose procurement as his angle. Legislation can get challenged in court. Purchasing decisions are harder to override.

What This Means Going Forward

California already had more than 20 AI-related statutes take effect in January 2026, covering employment, healthcare, education, and pricing algorithms. The Transparency in Frontier AI Act, signed in September 2025, requires large AI developers to publish safety frameworks and report safety incidents, with fines up to $1 million per violation.

This executive order adds procurement as a new enforcement tool on top of all that.

For AI vendors, the takeaway is straightforward. California isn't waiting for Washington to set the rules. It's writing its own. And given the state's market position, those rules will likely shape how AI companies build their compliance programs everywhere.

For everyone else watching the AI policy space, this is worth paying attention to. Procurement standards rarely make headlines the way legislation does, but they can be just as effective at changing how companies behave. When the biggest buyer in the room starts demanding proof of responsible AI, the whole market feels it.

The 120-day clock is ticking.