LongtermWiki Project
Planning and strategy documents for LongtermWiki — a strategic intelligence platform for AI safety prioritization.
Documents
Core Concepts
- Critical Insights — Framework for identifying high-value knowledge contributions
Vision & Strategy
- Vision Document — 2-person-year scope, architecture, milestones
- Strategy Brainstorm — Failure modes, definitions of success, strategic options
Research
- Similar Projects Analysis — Lessons from Arbital, Stampy, EA Forum Wiki, MIT AI Risk Repository, and others
Quick Summary
What is LongtermWiki?
A decision-support tool for funders, researchers, and policymakers asking: “Where should the next marginal dollar or researcher-hour go?”
Core components:
- Knowledge Base (~30% effort) — Risks, interventions, models
- Crux Graph (~25% effort) — Key uncertainties and dependencies
- Worldview → Priority Mapping (~20% effort) — How different assumptions translate into different priorities
- Disagreement Decomposition (~15% effort) — Turn fuzzy disagreements into resolvable questions
- Living Document Infrastructure (~10% effort) — Staleness tracking and freshness indicators
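Purely as an illustration of how two of these components might be represented (this sketch is not taken from the planning documents), here is a minimal TypeScript model of a Crux Graph node plus the freshness metadata a living-document system could use for staleness tracking; every type, field, and function name below is hypothetical.

```ts
// Hypothetical data shapes for LongtermWiki; all names are illustrative only.

/** A node in the Crux Graph: a key uncertainty that priorities hinge on. */
interface Crux {
  id: string;
  question: string;                 // e.g. "How fast will capabilities scale?"
  dependsOn: string[];              // ids of upstream cruxes this one depends on
  relatedInterventions: string[];   // knowledge-base entries whose value shifts with this crux
}

/** Freshness metadata for living-document staleness tracking. */
interface FreshnessInfo {
  lastReviewed: Date;
  reviewIntervalDays: number;       // how often this entry should be re-checked
}

/** Returns true when an entry is overdue for review. */
function isStale(meta: FreshnessInfo, now: Date = new Date()): boolean {
  const ageDays = (now.getTime() - meta.lastReviewed.getTime()) / 86_400_000;
  return ageDays > meta.reviewIntervalDays;
}
```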
Key lessons from similar projects:
- Paid contributors > volunteer-only (EA Forum Wiki, Stampy fellowship)
- Narrow focus > comprehensive scope (Stampy FAQ vs Arbital)
- Platform integration > standalone wiki (LessWrong/EAF wikis)
- Single editorial owner is essential