Legal analysis from the law firm Morrison Foerster on the binding private-sector obligations in Biden's 2023 AI EO, relevant to compute governance and AI reporting requirements.
Metadata
Importance: 62/100 · organizational report · analysis
Summary
This Morrison Foerster client alert analyzes Biden's October 2023 AI Executive Order, focusing on its unprecedented direct obligations on private companies to disclose information about powerful AI models (trained with >10^26 FLOPs) and computing clusters to the federal government. It examines the legal basis for these compelled disclosures under the Defense Production Act and the scope of covered models and clusters.
Key Points
- The EO requires disclosure to the federal government for AI models trained using >10^26 FLOPs, or >10^23 FLOPs if trained primarily on biological sequence data.
- Companies acquiring or developing especially powerful computing clusters must also report to the government, covering hardware specs, ownership, and foreign involvement.
- The Secretary of Commerce, in consultation with intelligence and defense officials, will define and update the technical thresholds for coverage.
- Legal authority for these private-sector mandates derives from the Defense Production Act, giving the EO binding enforcement power beyond typical agency directives.
- Required disclosures include safety test results, red-teaming findings, and other information relevant to dual-use foundation models.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| US Executive Order on Safe, Secure, and Trustworthy AI | Policy | 91.0 |
Cached Content Preview
HTTP 200 · Fetched Apr 7, 2026 · 12 KB
Client Alert: The AI Executive Order: Presidential Authority for Compelled Disclosures for AI Models and Computing Clusters
07 Nov 2023
This client alert is one in a series of alerts on the various aspects of the executive order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence that was signed by President Biden on October 30, 2023.
On October 30, 2023, President Biden signed an executive order (EO) on the development and use of artificial intelligence. While EOs are typically directed to government officials and agencies, and the bulk of this EO is, it is noteworthy for also placing direct obligations on private actors. Specifically, pursuant to the EO, companies (1) developing especially powerful models, or (2) acquiring or developing especially powerful computing clusters, are compelled to make certain disclosures to the federal government.
Biden characterized the EO as “the most significant action any government anywhere in the world has ever taken on AI safety, security, and trust.”
What Models and Computing Clusters Are Covered?
The secretary of commerce, in consultation with the director of national intelligence and secretaries of state, defense, and energy, is tasked with defining, and updating regularly, the technical conditions that will subject models and computing clusters to the reporting obligations described above. However, the MIT Technology Review reported that “a White House spokesperson said that the mandate will be enforceable and apply to all future commercial AI models in the US.”
Until the secretary of commerce sets the conditions for which models and computing clusters are covered, the EO itself defines the interim coverage criteria that trigger the reporting requirements discussed below:
Model
Any AI model that was trained:
using a quantity of computing power that exceeds 10^26 integer or floating-point operations (FLOPs); or
primarily on biological sequence data and trained using a quantity of computing power that exceeds 10^23 FLOPs.
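The two interim compute thresholds above reduce to a simple numeric test. The sketch below is purely illustrative and is not part of the EO or the alert; the function name and inputs are assumptions, while the threshold values come from the order's interim criteria.

```python
# Threshold values are taken from the EO's interim reporting criteria;
# the function and its signature are hypothetical, for illustration only.
GENERAL_THRESHOLD_FLOPS = 1e26       # any AI model trained above this
BIO_SEQUENCE_THRESHOLD_FLOPS = 1e23  # lower bar for models trained primarily
                                     # on biological sequence data

def reporting_required(training_flops: float, primarily_bio_data: bool) -> bool:
    """Return True if a training run would meet the EO's interim
    thresholds that trigger the compelled-disclosure obligations."""
    if primarily_bio_data and training_flops > BIO_SEQUENCE_THRESHOLD_FLOPS:
        return True
    return training_flops > GENERAL_THRESHOLD_FLOPS

# A 2e26 FLOP run exceeds the general threshold:
print(reporting_required(2e26, primarily_bio_data=False))  # True
# A 5e23 FLOP run on biological sequence data exceeds the lower threshold:
print(reporting_required(5e23, primarily_bio_data=True))   # True
# The same 5e23 FLOP run on general data does not:
print(reporting_required(5e23, primarily_bio_data=False))  # False
```

Note the asymmetry the EO creates: a model trained primarily on biological sequence data is covered at one-thousandth the compute of a general-purpose model.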
While being a “dual-use foundation model” is not expressly listed as one of the conditions for falling within the purview of the disclosure requirements, the required disclosures (discussed below) only concern dual-use foundation models. The likely explanation for this seeming inconsistency in terminology is that any AI model that meets either of the conditions listed above necessarily is a “dual-use foundation model,” as defined in the EO. The EO defines “dual-use foundation model” as any model that:
is trained on broad data;
uses self-supervision (generally);
contains at least tens of billions of parameters;
is usable ac
... (truncated, 12 KB total)
Resource ID: 45a096af2d92cae2 | Stable ID: MzRkMGUzYz