Also known as: CAISI, Center for AI Standards and Innovation, US AISI, USAISI
The US AI Safety Institute (US AISI), renamed the Center for AI Standards and Innovation (CAISI) in June 2025, is a government body within the National Institute of Standards and Technology (NIST). It was established in 2023 to develop standards, evaluations, and guidelines for safe and trustworthy artificial intelligence.
Top Related Pages
UK AI Safety Institute
The UK AI Safety Institute (renamed the AI Security Institute in February 2025) is a government body with more than 30 technical staff and an annu...
METR
Model Evaluation and Threat Research (METR) conducts dangerous-capability evaluations of frontier AI models, testing for autonomous replication, cybersec...
Anthropic
An AI safety company founded by former OpenAI researchers that develops frontier AI models while pursuing safety research, including the Claude mod...
OpenAI
A leading AI lab that developed the GPT models and ChatGPT, analyzing its organizational evolution from non-profit research to commercial AGI development ami...
Bletchley Declaration
The first international agreement on AI safety, signed by 28 countries at the November 2023 AI Safety Summit, committing to cooperation on frontie...