What training programs and courses are available for AGI safety?


Alternate Phrasings

  • Where can I get educated in alignment?

Canonical Answer

  • AGI Safety Fundamentals (technical and governance tracks) - The canonical AGI safety 101 course. Roughly 3.5 hours of reading and 1.5 hours of facilitated discussion per week for 8 weeks.
  • Refine - A 3-month incubator for conceptual AI alignment research in London, hosted by Conjecture.
  • AI Safety Camp - Do some actual AI safety research; the emphasis is more on producing output than on learning.
  • SERI ML Alignment Theory Scholars Program (SERI MATS) - Four weeks developing an understanding of a research agenda at the forefront of AI alignment through online readings and cohort discussions, averaging 10 hours/week. After this initial upskilling period, scholars are paired with an established AI alignment researcher for a two-week ‘research sprint’ to test fit. Assuming all goes well, scholars are accepted into an eight-week intensive scholars program in Berkeley, California.
  • Principles of Intelligent Behavior in Biological and Social Systems (PIBBSS) - Brings together young researchers studying complex and intelligent behavior in natural and social systems.
  • Safety and Control for Artificial General Intelligence - An actual university course on AI safety (UC Berkeley). Touches on multiple domains, including cognitive science, utility theory, cybersecurity, human-machine interaction, and political science.

See also this spreadsheet of learning resources.


Tags: contributing, stub, education, plex's answer to What are some good resources on AI alignment?

Question Info
Asked by: plex
Origin: Wiki
Date: 2022/08/14
