Objections and responses
You've heard about existential risks: risks of human extinction or of the destruction of humanity's long-term potential.
The topic of existential risks from AI is complicated, and most simple objections to AI risk break down upon closer examination. The articles below discuss some of the most common reasons people are unsure about the risks, and why we think that this problem deserves our attention nonetheless.
Keep reading: start with the first entry in "Objections and responses", "How can an AGI be smarter than all of humanity?"