Orrick: California Looks to Regulate Cutting-Edge Frontier AI Models: 5 Things to Know
SB 1047 was a high-profile 2024 California bill that sparked major debate in the AI industry; it was ultimately vetoed by Governor Newsom in September 2024 but remains influential as a model for future AI legislation.
Metadata
Importance: 52/100 | blog post | analysis
Summary
Orrick law firm provides a legal analysis of California's SB 1047, a landmark bill proposing safety requirements for developers of large frontier AI models. The piece outlines key provisions including safety assessments, incident reporting, and kill-switch requirements, as well as potential compliance burdens and constitutional questions. It serves as a practical overview for businesses navigating the proposed regulatory landscape.
Key Points
- SB 1047 targets developers of large AI models (above a compute threshold) operating in California, requiring safety and security protocols before deployment.
- Key requirements include conducting hazard assessments, implementing emergency shutdown capabilities, and reporting serious safety incidents to a new state oversight body.
- The bill creates potential liability for developers if their models are used to cause mass casualties or critical infrastructure attacks, even by third parties.
- Critics raise concerns about compliance costs for smaller developers and startups, as well as ambiguity in key definitions like "covered model" and "reasonable care".
- The legislation reflects a broader trend of state-level AI regulation in the absence of comprehensive federal law, with significant implications for the global AI industry.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Safe and Secure Innovation for Frontier Artificial Intelligence Models Act | Policy | 66.0 |
Cached Content Preview
HTTP 200 | Fetched Apr 9, 2026 | 20 KB
California Looks to Regulate Cutting-Edge Frontier AI Models: 5 Things to Know About SB-1047
6 minute read | July 19, 2024
On May 21, 2024, the California Senate passed the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act (SB-1047), and referred the bill to the State Assembly. Though the bill has already been amended in the Assembly and may be subject to further amendment, SB-1047 would impose substantial compliance obligations on entities involved in training the most powerful and well-funded artificial intelligence models.
The bill seeks to:
Regulate AI based on the amount of money and computing power used to train a model.
Impose an array of compliance obligations on developers of covered models.
Regulate operators of computing clusters used to train a covered model.
Grant expansive labor protections for employees of developers, contractors and subcontractors.
Confer oversight powers to new and existing regulators.
In More Detail
1. The bill would regulate AI models based on the amount of money and computing power used to train the model.
The bill defines a “covered model” as an AI model “trained using a quantity of computing power greater than 10^26 integer or floating-point operations, the cost of which exceeds one hundred million dollars ($100,000,000) when calculated using the average market prices of cloud compute at the start of training as reasonably assessed by the developer.” The definition also encompasses covered models fine-tuned using computing power equal to or greater than three times 10^25 integer or floating-point operations.
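As a rough illustration (not legal advice, and all names here are hypothetical), the definition above is a two-pronged test: a model is covered only if both the compute prong and the cost prong are met, while a fine-tune of a covered model is covered based on compute alone. A minimal sketch:

```python
# Hypothetical sketch of SB-1047's "covered model" thresholds (illustrative only).
# Function and constant names are invented for this example; they do not come
# from the bill text.

COMPUTE_THRESHOLD_FLOP = 1e26        # training compute must exceed 10^26 operations
COST_THRESHOLD_USD = 100_000_000     # training cost must exceed $100 million
FINE_TUNE_THRESHOLD_FLOP = 3e25      # fine-tuning at or above 3 x 10^25 operations

def is_covered_model(training_flop: float, training_cost_usd: float) -> bool:
    """Both prongs must be satisfied: compute AND cost."""
    return (training_flop > COMPUTE_THRESHOLD_FLOP
            and training_cost_usd > COST_THRESHOLD_USD)

def is_covered_fine_tune(fine_tune_flop: float) -> bool:
    """Fine-tunes of a covered model are covered at or above the compute floor."""
    return fine_tune_flop >= FINE_TUNE_THRESHOLD_FLOP
```

Note that a very large but cheaply trained model (e.g., above 10^26 operations but under $100 million in compute cost) would fall outside the definition, which is one source of the definitional ambiguity critics have flagged.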
A floating-point operation is a single arithmetic operation performed on floating-point numbers, i.e., numbers represented with a fractional component rather than as integers. Floating-point operations per second, or FLOPS, measures the speed of a computer or processor, while a total count of floating-point operations measures the overall compute used to train a model.
The compute threshold for models trained using more than 10^26 integer or floating-point operations is the same threshold used to impose reporting requirements under the Biden Administration's executive order on AI.
The bill also seeks to regulate “covered model derivatives,” which include copies of covered models, modified versions of covered models, and covered models combined with other software.
The bill would create a state agency called the Frontier Model Division, a new division of the California Government Operations Agency.
... (truncated, 20 KB total)