MMJ News Network
Tech

California’s new AI safety law shows regulation and innovation don’t have to clash 

By Editor · October 1, 2025 · 5 min read


SB 53, the AI safety and transparency bill that California Gov. Gavin Newsom signed into law this week, is proof that state regulation doesn’t have to hinder AI progress.  

So says Adam Billen, vice president of public policy at youth-led advocacy group Encode AI, on today’s episode of Equity. 

“The reality is that policymakers themselves know that we have to do something, and they know from working on a million other issues that there is a way to pass legislation that genuinely does protect innovation — which I do care about — while making sure that these products are safe,” Billen told TechCrunch. 

At its core, SB 53 is a first-in-the-nation bill that requires large AI labs to be transparent about their safety and security protocols — specifically, how they prevent their models from contributing to catastrophic risks, like being used to commit cyberattacks on critical infrastructure or build bio-weapons. The law also mandates that companies stick to those protocols, which will be enforced by the Office of Emergency Services.  

“Companies are already doing the stuff that we ask them to do in this bill,” Billen told TechCrunch. “They do safety testing on their models. They release model cards. Are they starting to skimp in some areas at some companies? Yes. And that’s why bills like this are important.” 

Billen also noted that some AI firms have a policy around relaxing safety standards under competitive pressure. OpenAI, for example, has publicly stated that it may “adjust” its safety requirements if a rival AI lab releases a high-risk system without similar safeguards. Billen argues that policy can enforce companies’ existing safety promises, preventing them from cutting corners under competitive or financial pressure. 

While public opposition to SB 53 was muted in comparison to its predecessor SB 1047, which Newsom vetoed last year, the rhetoric in Silicon Valley and among most AI labs has been that almost any AI regulation is anathema to progress and will ultimately hinder the U.S. in its race to beat China.  


It’s why companies like Meta, VCs like Andreessen Horowitz, and powerful individuals like OpenAI president Greg Brockman are collectively pumping hundreds of millions into super PACs to back pro-AI politicians in state elections. And it’s why those same forces earlier this year pushed for an AI moratorium that would have banned states from regulating AI for 10 years.  

Encode AI ran a coalition of more than 200 organizations to work to strike down the proposal, but Billen says the fight isn’t over. Senator Ted Cruz, who championed the moratorium, is attempting a new strategy to achieve the same goal of federal preemption of state laws. In September, Cruz introduced the SANDBOX Act, which would allow AI companies to apply for waivers to temporarily bypass certain federal regulations for up to 10 years. Billen also anticipates a forthcoming bill establishing a federal AI standard that would be pitched as a middle-ground solution but would in reality override state laws. 

He warned that narrowly scoped federal AI legislation could “delete federalism for the most important technology of our time.” 

“If you told me SB 53 was the bill that would replace all the state bills on everything related to AI and all of the potential risks, I would tell you that’s probably not a very good idea and that this bill is designed for a particular subset of things,” Billen said.  

Adam Billen, vice president of public policy, Encode AI. Image credits: Encode AI

While he agrees that the AI race with China matters, and that policymakers need to enact regulation that will support American progress, he says killing state bills — which mainly focus on deepfakes, transparency, algorithmic discrimination, children’s safety, and governmental use of AI — isn’t the way to go about doing that. 

“Are bills like SB 53 the thing that will stop us from beating China? No,” he said. “I think it is just genuinely intellectually dishonest to say that that is the thing that will stop us in the race.” 

He added: “If the thing you care about is beating China in the race on AI — and I do care about that — then the things you would push for are stuff like export controls in Congress. You would make sure that American companies have the chips. But that’s not what the industry is pushing for.” 

Legislative proposals like the Chip Security Act aim to prevent the diversion of advanced AI chips to China through export controls and tracking devices, and the existing CHIPS and Science Act seeks to boost domestic chip production. However, some major tech companies, including OpenAI and Nvidia, have expressed reluctance or opposition to certain aspects of these efforts, citing concerns about effectiveness, competitiveness, and security vulnerabilities.  

Nvidia has its reasons — it has a strong financial incentive to continue selling chips to China, which has historically represented a significant portion of its global revenue. Billen speculated that OpenAI could hold back on chip export advocacy to stay in the good graces of crucial suppliers like Nvidia. 

There’s also been inconsistent messaging from the Trump administration. Three months after expanding an export ban on advanced AI chips to China in April 2025, the administration reversed course, allowing Nvidia and AMD to sell some chips to China in exchange for 15% of the revenue. 

“You see people on the Hill moving towards bills like the Chip Security Act that would put export controls on China,” Billen said. “In the meantime, there’s going to continue to be this propping up of the narrative to kill state bills that are actually quite light touch.” 

Billen added that SB 53 is an example of democracy in action — of industry and policymakers working together to get to a version of a bill that everyone can agree on. It’s “very ugly and messy,” but “that process of democracy and federalism is the entire foundation of our country and our economic system, and I hope that we will keep doing that successfully.” 

“I think SB 53 is one of the best proof points that that can still work,” he said. 




Tags: AI policy, AI safety, Equity, SB 1047, SB 53