Longterm Wiki

Jan Leike — Notable For: VP of Alignment Science at Anthropic; former co-lead of OpenAI Superalignment team; prominent advocate for AI safety resource allocation

Verdict: partial · 85%
1 check · 4/20/2026

The source confirms two of the three claim components: (1) his role as co-lead of OpenAI's Superalignment team is confirmed, and (2) his joining Anthropic in May 2024 is confirmed. However, the source does not specify his title at Anthropic as 'VP of Alignment Science'—it only states he 'joined Anthropic' without listing a specific role. Additionally, while the source shows he was featured in Time's AI 100 list (suggesting prominence in AI safety), it does not explicitly characterize him as a 'prominent advocate for AI safety resource allocation.' The claim is partially supported but lacks complete verification of all three elements.

Our claim

Subject: Jan Leike
Property: Notable For
Value: VP of Alignment Science at Anthropic; former co-lead of OpenAI Superalignment team; prominent advocate for AI safety resource allocation

Source evidence

1 src · 1 check
partial (85%) · primary · Haiku 4.5 · 4/20/2026

Note: Same as the verdict summary above.

Case № f_jL4fG5hI6j · Filed 4/20/2026 · Confidence 85%