Longterm Wiki

Compute and other expenses for LLM alignment research

Amount: $400K
Funder: Manifund
Recipient: Ethan Josean Perez
Program: (not listed)
Date: Aug 2023
Source: (not listed)
Notes: [Technical AI safety] Four projects: finding RLHF alignment failures, debate, improving chain-of-thought (CoT) faithfulness, and model organisms.

Other Grants by Manifund

Manifund has made 376 grants in total.