Hallucination Risk
Risk scores are computed from citation density, entity type, quality score, content integrity, and other signals. 756 pages were assessed; 164 are high-risk.
Total Assessed: 756
High Risk: 164
Medium Risk: 507
Low Risk: 85
Avg Score: 53
Most Common Risk Factors
no-citations (582), few-external-sources (221), low-quality-score (172), minimal-content (149), biographical-claims (138), low-rigor-score (130), conceptual-content (128), high-rigor (120), high-quality (40), structured-format (31)
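Factor counts like the ones above can be produced with a simple tally over the assessed pages. This is a hypothetical sketch; the `Page` shape is an assumption for illustration, not the schema used by the canonical scorer:

```typescript
// Hypothetical sketch: tally how often each risk factor appears across
// assessed pages, producing counts like "no-citations: 582" above.
// The Page interface is an illustrative assumption.
interface Page {
  title: string;
  factors: string[];
}

function tallyFactors(pages: Page[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const page of pages) {
    for (const factor of page.factors) {
      counts.set(factor, (counts.get(factor) ?? 0) + 1);
    }
  }
  // Sort descending by count, matching the ordering of the list above.
  return new Map([...counts].sort((a, b) => b[1] - a[1]));
}
```

A `Map` is used rather than a plain object so the sorted insertion order is preserved when iterating for display.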
Risk Distribution
High: 22%, Medium: 67%, Low: 11%
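The bucketing and percentages above can be sketched as follows. The 75/40 score cutoffs are illustrative assumptions only; the canonical thresholds live in `crux/lib/hallucination-risk.ts`:

```typescript
// Hypothetical sketch: bucket a 0-100 risk score into a level and
// compute the distribution percentages shown above. The cutoffs are
// assumptions (the lowest high-risk score in the table is 75).
type RiskLevel = "high" | "medium" | "low";

function riskLevel(score: number): RiskLevel {
  if (score >= 75) return "high";
  if (score >= 40) return "medium";
  return "low";
}

// Percentage of pages in a bucket, rounded to a whole percent.
function pct(count: number, total: number): number {
  return Math.round((count / total) * 100);
}

// Reproduce the headline distribution: 164 high, 507 medium, 85 low of 756.
const dist = { high: pct(164, 756), medium: pct(507, 756), low: pct(85, 756) };
console.log(dist); // { high: 22, medium: 67, low: 11 }
```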
| Page | Risk Score | Risk Level | Entity Type | Quality | Factors |
|---|---|---|---|---|---|
| xAI | 95 | high | organization | 48 | biographical-claims, few-external-sources, severe-truncation |
| Ilya Sutskever | 95 | high | person | 26 | biographical-claims, no-citations, low-rigor-score, +2 |
| Neel Nanda | 95 | high | person | 26 | biographical-claims, no-citations, low-rigor-score, +2 |
| Nick Bostrom | 95 | high | person | 25 | biographical-claims, no-citations, low-rigor-score, +2 |
| Early Warnings (1950s-2000) | 90 | high | historical | 31 | specific-factual-claims, no-citations, low-rigor-score, +2 |
| The MIRI Era (2000-2015) | 90 | high | historical | 31 | specific-factual-claims, no-citations, low-rigor-score, +2 |
| Astralis Foundation | 90 | high | organization | 30 | biographical-claims, no-citations, low-rigor-score, +1 |
| Lightning Rod Labs | 90 | high | organization | 38 | biographical-claims, no-citations, low-rigor-score, +1 |
| Gratified | 90 | high | organization | 25 | biographical-claims, no-citations, low-rigor-score, +1 |
| Daniela Amodei | 90 | high | person | 21 | biographical-claims, no-citations, low-rigor-score, +1 |
| Dan Hendrycks | 90 | high | person | 19 | biographical-claims, no-citations, low-rigor-score, +1 |
| Connor Leahy | 90 | high | person | 19 | biographical-claims, no-citations, low-rigor-score, +1 |
| Chris Olah | 90 | high | person | 27 | biographical-claims, no-citations, low-rigor-score, +1 |
| Sentinel (Catastrophic Risk Foresight) | 90 | high | organization | 39 | biographical-claims, no-citations, low-rigor-score, +1 |
| Jan Leike | 90 | high | person | 27 | biographical-claims, no-citations, low-rigor-score, +1 |
| Common Writing Principles | 90 | high | internal | 0 | low-citation-density, low-quality-score, few-external-sources, +1 |
| Conjecture | 85 | high | organization | 37 | biographical-claims, no-citations, low-quality-score, +1 |
| CHAI (Center for Human-Compatible AI) | 85 | high | organization | 37 | biographical-claims, no-citations, low-quality-score, +1 |
| Giving Pledge | 85 | high | organization | 68 | biographical-claims, low-citation-density, high-rigor, +1 |
| FAR AI | 85 | high | organization | 76 | biographical-claims, no-citations, low-rigor-score |
| Google DeepMind | 85 | high | organization | 37 | biographical-claims, no-citations, low-quality-score, +1 |
| Seldon Lab | 85 | high | organization | 45 | biographical-claims, no-citations, low-rigor-score |
| Paul Christiano | 85 | high | person | 39 | biographical-claims, no-citations, low-quality-score, +1 |
| Stuart Russell | 85 | high | person | 30 | biographical-claims, no-citations, low-quality-score, +1 |
| Yoshua Bengio | 85 | high | person | 39 | biographical-claims, no-citations, low-quality-score, +1 |
| Rogue AI Scenarios | 85 | high | risk | 55 | low-citation-density, few-external-sources, severe-truncation |
| EA and Longtermist Wins and Losses | 80 | high | - | 53 | low-citation-density, severe-truncation |
| Large Language Models | 80 | high | capability | 60 | low-citation-density, severe-truncation |
| CSER (Centre for the Study of Existential Risk) | 80 | high | organization | 58 | biographical-claims, no-citations, few-external-sources |
| Council on Strategic Risks | 80 | high | organization | 38 | biographical-claims, no-citations, low-quality-score |
| CAIS (Center for AI Safety) | 80 | high | organization | 42 | biographical-claims, no-citations, few-external-sources |
| Bridgewater AIA Labs | 80 | high | organization | 66 | biographical-claims, no-citations, few-external-sources |
| Blueprint Biosecurity | 80 | high | organization | 60 | biographical-claims, no-citations, few-external-sources |
| AI Revenue Sources | 80 | high | organization | 55 | biographical-claims, no-citations, few-external-sources |
| AI Impacts | 80 | high | organization | 53 | biographical-claims, no-citations, few-external-sources |
| 1Day Sooner | 80 | high | organization | 60 | biographical-claims, no-citations, few-external-sources |
| NTI \| bio (Nuclear Threat Initiative - Biological Program) | 80 | high | organization | 60 | biographical-claims, no-citations, few-external-sources |
| Manifest (Forecasting Conference) | 80 | high | organization | 50 | biographical-claims, no-citations, few-external-sources |
| Lionheart Ventures | 80 | high | organization | 50 | biographical-claims, no-citations, few-external-sources |
| Kalshi (Prediction Market) | 80 | high | organization | 25 | biographical-claims, no-citations, low-quality-score |
| IBBIS (International Biosecurity and Biosafety Initiative for Science) | 80 | high | organization | 60 | biographical-claims, no-citations, few-external-sources |
| Eli Lifland | 80 | high | person | 58 | biographical-claims, no-citations, few-external-sources |
| Eliezer Yudkowsky | 80 | high | person | 35 | biographical-claims, no-citations, low-quality-score |
| EA Global | 80 | high | organization | 38 | biographical-claims, no-citations, low-quality-score |
| Dario Amodei | 80 | high | person | 41 | biographical-claims, no-citations, few-external-sources |
| Pause AI | 80 | high | organization | 59 | biographical-claims, no-citations, few-external-sources |
| Polymarket | 80 | high | organization | 33 | biographical-claims, no-citations, low-quality-score |
| Red Queen Bio | 80 | high | organization | 55 | biographical-claims, no-citations, few-external-sources |
| Rethink Priorities | 80 | high | organization | 60 | biographical-claims, no-citations, few-external-sources |
| Secure AI Project | 80 | high | organization | 47 | biographical-claims, no-citations, few-external-sources |
| SecureBio | 80 | high | organization | 65 | biographical-claims, no-citations, few-external-sources |
| Swift Centre | 80 | high | organization | 50 | biographical-claims, no-citations, few-external-sources |
| The Foundation Layer | 80 | high | organization | 3 | biographical-claims, no-citations, low-quality-score |
| The Sequences by Eliezer Yudkowsky | 80 | high | organization | 65 | biographical-claims, no-citations, few-external-sources |
| Turion | 80 | high | organization | 55 | biographical-claims, no-citations, few-external-sources |
| UK AI Safety Institute | 80 | high | organization | 52 | biographical-claims, no-citations, few-external-sources |
| Compute Governance: AI Chips Export Controls Policy | 80 | high | policy | 58 | low-citation-density, severe-truncation |
| Elon Musk (AI Industry) | 80 | high | person | 38 | biographical-claims, no-citations, low-quality-score |
| Geoffrey Hinton | 80 | high | person | 42 | biographical-claims, no-citations, few-external-sources |
| Holden Karnofsky | 80 | high | person | 40 | biographical-claims, no-citations, few-external-sources |
| Toby Ord | 80 | high | person | 41 | biographical-claims, no-citations, few-external-sources |
| Vidur Kapur | 80 | high | person | 38 | biographical-claims, no-citations, low-quality-score |
| claim-first-architecture | 80 | high | - | - | low-citation-density, severe-truncation |
| enhancement-queue | 75 | high | - | - | no-citations, low-rigor-score, low-quality-score, +1 |
| factors-transition-turbulence-overview | 75 | high | - | - | no-citations, low-rigor-score, low-quality-score, +1 |
| LongtermWiki Value Proposition | 75 | high | internal | 4 | no-citations, low-rigor-score, low-quality-score, +1 |
| LongtermWiki Vision | 75 | high | internal | 2 | no-citations, low-rigor-score, low-quality-score, +1 |
| LongtermWiki Strategy Brainstorm | 75 | high | internal | 4 | no-citations, low-rigor-score, low-quality-score, +1 |
| Parameters Strategy | 75 | high | internal | 3 | no-citations, low-rigor-score, low-quality-score, +1 |
| Project Roadmap | 75 | high | internal | 29 | no-citations, low-rigor-score, low-quality-score, +1 |
| scenarios-overview | 75 | high | - | - | no-citations, low-rigor-score, low-quality-score, +1 |
| CSET (Center for Security and Emerging Technology) | 75 | high | organization | 43 | biographical-claims, no-citations |
| ControlAI | 75 | high | organization | 63 | biographical-claims, no-citations |
| Coefficient Giving | 75 | high | organization | 55 | biographical-claims, no-citations |
| Coalition for Epidemic Preparedness Innovations | 75 | high | organization | 53 | biographical-claims, no-citations |
| Chan Zuckerberg Initiative | 75 | high | organization | 50 | biographical-claims, no-citations |
| Center for Applied Rationality | 75 | high | organization | 62 | biographical-claims, no-citations |
| Arb Research | 75 | high | organization | 50 | biographical-claims, no-citations |
| Apollo Research | 75 | high | organization | 58 | biographical-claims, no-citations |
| Anthropic | 75 | high | organization | 74 | biographical-claims, no-citations |
| AI Futures Project | 75 | high | organization | 50 | biographical-claims, no-citations |
| 80,000 Hours | 75 | high | organization | 45 | biographical-claims, no-citations |
| MIRI (Machine Intelligence Research Institute) | 75 | high | organization | 50 | biographical-claims, no-citations |
| Microsoft AI | 75 | high | organization | 58 | biographical-claims, no-citations |
| METR | 75 | high | organization | 66 | biographical-claims, no-citations |
| Metaculus | 75 | high | organization | 50 | biographical-claims, no-citations |
| Meta AI (FAIR) | 75 | high | organization | 51 | biographical-claims, no-citations |
| MATS ML Alignment Theory Scholars program | 75 | high | organization | 60 | biographical-claims, no-citations |
| Manifund | 75 | high | organization | 50 | biographical-claims, no-citations |
| Manifold (Prediction Market) | 75 | high | organization | 43 | biographical-claims, no-citations |
| MacArthur Foundation | 75 | high | organization | 65 | biographical-claims, no-citations |
| Long-Term Future Fund (LTFF) | 75 | high | organization | 56 | biographical-claims, no-citations |
| Longview Philanthropy | 75 | high | organization | 45 | biographical-claims, no-citations |
| Lighthaven (Event Venue) | 75 | high | organization | 40 | biographical-claims, no-citations |
| LessWrong | 75 | high | organization | 44 | biographical-claims, no-citations |
| Global Partnership on Artificial Intelligence (GPAI) | 75 | high | organization | 50 | biographical-claims, no-citations |
| GovAI | 75 | high | organization | 43 | biographical-claims, no-citations |
| Good Judgment (Forecasting) | 75 | high | organization | 50 | biographical-claims, no-citations |
| Giving What We Can | 75 | high | organization | 62 | biographical-claims, no-citations |
| FutureSearch | 75 | high | organization | 50 | biographical-claims, no-citations |
Showing 1–100 of 756
Data is served live from wiki-server. Scores are computed at build time by the canonical scorer (`crux/lib/hallucination-risk.ts`); run `pnpm crux validate hallucination-risk` for a CLI report.