LongtermWiki Project

Planning and strategy documents for LongtermWiki — a strategic intelligence platform for AI safety prioritization.

  • Critical Insights — Framework for identifying high-value knowledge contributions
  • Vision Document — 2-person-year scope, architecture, milestones
  • Strategy Brainstorm — Failure modes, definitions of success, strategic options
  • Similar Projects Analysis — Lessons from Arbital, Stampy, EA Forum Wiki, MIT AI Risk Repository, and others

What is LongtermWiki?

A decision-support tool for funders, researchers, and policymakers asking: “Where should the next marginal dollar or researcher-hour go?”

Core components:

  1. Knowledge Base (~30% effort) — Risks, interventions, models
  2. Crux Graph (~25% effort) — Key uncertainties and dependencies
  3. Worldview → Priority Mapping (~20% effort) — How underlying assumptions map to different priorities
  4. Disagreement Decomposition (~15% effort) — Turn fuzzy disagreements into resolvable questions
  5. Living Document Infrastructure (~10% effort) — Staleness tracking and freshness indicators
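The staleness tracking in component 5 could be as simple as comparing each page's last-review date against a threshold. A minimal sketch, assuming a per-page review interval (the page names, dates, and 180-day interval below are all illustrative, not from the project plan):

```python
from datetime import date, timedelta

# Illustrative review interval; a real system might vary this per topic.
REVIEW_INTERVAL = timedelta(days=180)

def is_stale(last_reviewed: date, today: date,
             interval: timedelta = REVIEW_INTERVAL) -> bool:
    """Return True when the page is overdue for a freshness review."""
    return today - last_reviewed > interval

# Hypothetical page registry mapping page name -> last review date.
pages = {
    "compute-governance": date(2024, 1, 15),
    "interpretability": date(2024, 11, 2),
}

today = date(2025, 1, 1)
stale_pages = [name for name, reviewed in pages.items()
               if is_stale(reviewed, today)]
# stale_pages -> ["compute-governance"]
```

In practice the freshness signal would likely be surfaced on each page rather than computed in batch, but the core check is just this date comparison.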

Key lessons from similar projects:

  • Paid contributors > volunteer-only (EA Forum Wiki, Stampy fellowship)
  • Narrow focus > comprehensive scope (Stampy FAQ vs Arbital)
  • Platform integration > standalone wiki (LessWrong/EAF wikis)
  • Single editorial owner is essential