The author of California’s SB 1047, the nation’s most controversial AI safety bill of 2024, is back with a new AI bill that could shake up Silicon Valley.
California State Senator Scott Wiener introduced a new bill on Friday that would protect employees at leading AI labs, allowing them to speak out if they think their company’s AI systems could be a “critical risk” to society. The new bill, SB 53, would also create a public cloud computing cluster, called CalCompute, to give researchers and startups the necessary computing resources to develop AI that benefits the public.
Wiener’s last AI bill, California’s SB 1047, sparked a lively debate across the country around how to handle massive AI systems that could cause disasters. SB 1047 aimed to prevent the possibility of very large AI models creating catastrophic events, such as causing loss of life or cyberattacks costing more than $500 million in damages. However, Governor Gavin Newsom ultimately vetoed the bill in September, saying SB 1047 was not the best approach.
But the debate over SB 1047 quickly turned ugly. Some Silicon Valley leaders said SB 1047 would hurt America’s competitive edge in the global AI race, and claimed the bill was inspired by unrealistic fears that AI systems could bring about science fiction-like doomsday scenarios. Meanwhile, Senator Wiener alleged that some venture capitalists engaged in a “propaganda campaign” against his bill, pointing in part to Y Combinator’s claim that SB 1047 would send startup founders to jail, a claim experts argued was misleading.
SB 53 essentially takes the least controversial parts of SB 1047 – such as whistleblower protections and the establishment of a CalCompute cluster – and repackages them into a new bill.
Notably, Wiener is not shying away from existential AI risk in SB 53. The new bill specifically protects whistleblowers who believe their employers are creating a “critical risk.” The bill defines critical risk as a “foreseeable or material risk that a developer’s development, storage, or deployment of a foundation model, as defined, will result in the death of, or serious injury to, more than 100 people, or more than $1 billion in damage to rights in money or property.”
SB 53 limits frontier AI model developers – likely including OpenAI, Anthropic, and xAI, among others – from retaliating against employees who disclose concerning information to California’s Attorney General, federal authorities, or other employees. Under the bill, these developers would be required to report back to whistleblowers on certain internal processes the whistleblower finds concerning.
As for CalCompute, SB 53 would establish a group to build out a public cloud computing cluster. The group would consist of University of California representatives, as well as other public and private researchers. It would make recommendations for how to build CalCompute, how large the cluster should be, and which users and organizations should have access to it.
Of course, it’s very early in the legislative process for SB 53. The bill needs to be reviewed and passed by California’s legislative bodies before it reaches Governor Newsom’s desk. State lawmakers will surely be waiting for Silicon Valley’s reaction to SB 53.
However, 2025 may be a tougher year to pass AI safety bills compared to 2024. California passed 18 AI-related bills in 2024, but now it seems as if the AI doom movement has lost ground.
Vice President JD Vance signaled at the Paris AI Action Summit that America is not interested in AI safety, but rather prioritizes AI innovation. While the CalCompute cluster established by SB 53 would surely be seen as advancing AI progress, it’s unclear how legislative efforts around existential AI risk will fare in 2025.