How can I work on assessing AI alignment projects and distributing grants?
It’s hard to get into this field, even though assessing projects and distributing grants is an important part of AI alignment (a research field about how to prevent risks from advanced artificial intelligence).
To do well here, you also need a strong inside-view understanding of the field. It can help to actively seek out people to offer grants to, and, after making grants, to write follow-ups in places like the EA Forum.