Expert Positions (2 topics)
| Topic | View | Estimate | Confidence | Date | Source | Source check |
|---|---|---|---|---|---|---|
| How Hard Is Alignment? | Very hard — focus on control instead | Easier to assume misalignment and control for it | high | 2025 | 80,000 Hours Podcast | |
| Will Advanced AI Be Deceptive? | Significant risk | ~30% probability of scheming before escape attempt | medium | Apr 2024 | AXRP Episode 27 — AI Control (2024) | |
Facts
Biographical
Notable For: AI safety research; Redwood Research leadership
General
Website: https://redwoodresearch.org