Also known as: MIRI, Singularity Institute, Singularity Institute for Artificial Intelligence, SIAI
The Machine Intelligence Research Institute (MIRI) is one of the oldest organizations focused on AI existential risk, founded in 2000 as the Singularity Institute for Artificial Intelligence (SIAI).
Facts
| Dimension | Rating | Evidence | Assessor |
|---|---|---|---|
| current-strategy | Policy advocacy to halt AI development | Major 2024 pivot after acknowledging alignment research is "extremely unlikely to succeed in time" [MIRI About](https://intelligence.org/about/) | editorial |
| field-impact | Controversial but influential | Raised awareness but faced criticism for its theoretical approach and failed research programs [LessWrong](https://www.lesswrong.com/posts/rfNHWe5JWhGuSqMHN/steelmanning-miri-critics) | editorial |
| financial-status | Operating at a deficit with ~2-year runway | $4.97M net loss in 2024; $15.24M in net assets [ProPublica](https://projects.propublica.org/nonprofits/organizations/582565917) | editorial |
| historical-significance | First organization to treat ASI alignment as a technical problem | Among the first to recognize ASI as the most important event of the 21st century [MIRI About](https://intelligence.org/about/) | editorial |
| research-output | Minimal recent publications | Near-zero new publications from core researchers between 2018 and 2022 [LessWrong](https://www.lesswrong.com/posts/rfNHWe5JWhGuSqMHN/steelmanning-miri-critics) | editorial |
| Title | Date | Event Type | Description | Significance |
|---|---|---|---|---|
| Strategic pivot away from alignment research | 2024 | pivot | Announced in 2024; current focus is on attempting to halt the development of increasingly general AI models via discussions with policymakers about extreme risks. | major |
| $4.3M Ethereum donation from Vitalik Buterin | 2021-05 | funding | Contributed to a revenue spike to $25.6M in 2021. | major |
| Largest single Open Philanthropy grant — $7.7M | 2020-04 | funding | $6.24M from main OP funders + $1.46M from BitMEX co-founder Ben Delo. At this peak, OP provided ~60% of MIRI's projected budgets for 2020-2021. | major |
| Open Philanthropy two-year general support grant ($2.65M) | 2019-02 | funding | Provided $2,652,500 over two years; OP support grew from $1.4M (2018) to $2.31M (2019). | moderate |
| Renamed to Machine Intelligence Research Institute | 2013-01 | pivot | — | major |
| Sold name, web domain, and Singularity Summit to Singularity University | 2012-12 | pivot | Marked the end of the public-outreach phase. | major |
| First Singularity Summit | 2006 | launch | Annual summit organized in cooperation with Stanford University, with funding from Peter Thiel. | moderate |
| Reorientation toward AI safety | 2005 | pivot | Yudkowsky's concerns about superintelligent AI risks prompted a fundamental reorientation toward AI safety. The organization also relocated from Atlanta to Silicon Valley that year. | major |
| Singularity Institute for Artificial Intelligence founded | 2000 | founding | Founded by Eliezer Yudkowsky with the original (paradoxical) mission of accelerating AI development. | major |
Divisions
Core technical research on the mathematical foundations of AI alignment, including agent foundations and decision theory
Related Wiki Pages
Eliezer Yudkowsky
Co-founder of MIRI, early AI safety researcher, and founder of the rationalist community
Sharp Left Turn
The Sharp Left Turn hypothesis proposes that AI capabilities may generalize discontinuously to new domains while alignment properties fail to transfer.
Alignment Research Center (ARC)
AI safety research nonprofit operating as ARC Theory, investigating fundamental alignment problems including Eliciting Latent Knowledge and heuristic arguments.
Instrumental Convergence
Instrumental convergence is the tendency for AI systems to develop dangerous subgoals, such as self-preservation and resource acquisition, regardless of their final goals.
The MIRI Era
The formation of organized AI safety research, from the Singularity Institute to Bostrom's Superintelligence