Wouldn't a superintelligence be smart enough to avoid misunderstanding our instructions?
While a superintelligence (an AI with cognitive abilities far greater than those of humans in a wide range of important domains) would indeed be smart enough to understand what we meant by our instructions, understanding an instruction is not the same as caring about it. Knowing what we intended does not, by itself, give an AI a reason to pursue it.

As an analogy: humans are smart enough to understand some of our own “programming”. For example, we know that natural selection “gave” us the urge to have sex so that we would reproduce.[1] Yet that understanding doesn’t compel us to serve reproduction: many people use contraception, satisfying the urge while setting aside the “goal” it was selected for. In the same way, an AI could fully grasp what we meant and still act on whatever objective it actually has.
[1] Evolution does not act as an agent (a system that can be understood as taking actions towards achieving a goal) that “decides” things, but it is analogous in some ways to stochastic gradient descent, in that it shaped the way that we, the agents, act. How well this analogy holds is debated among researchers.
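To make the footnote’s analogy a bit more concrete, here is a minimal, hypothetical sketch of stochastic gradient descent in plain Python (the names `behavior` and `objective` and the toy y = 3x data are invented for illustration, not taken from any particular system). The point it illustrates: the objective exists only in the training loop, while the trained parameter just produces behavior, much as evolution’s “goal” of reproduction is not represented inside the humans it shaped.

```python
import random

def behavior(w, x):
    # The trained system simply maps inputs to outputs via its parameter.
    # Nothing in here represents or pursues the training objective.
    return w * x

def objective(w, data):
    # The optimizer's goal: mean squared error against the targets.
    return sum((behavior(w, x) - y) ** 2 for x, y in data) / len(data)

# Toy data: targets follow y = 3x, the "true" goal of the optimizer.
data = [(x, 3 * x) for x in range(1, 11)]

w, lr = 0.0, 0.001
for step in range(1000):
    x, y = random.choice(data)           # stochastic: one sample per step
    grad = 2 * (behavior(w, x) - y) * x  # gradient of squared error w.r.t. w
    w -= lr * grad                       # nudge the parameter downhill

# After training, w produces goal-consistent behavior (w is close to 3),
# but the objective that shaped it lives only in the loop above.
print(f"learned w = {w:.3f}, final objective = {objective(w, data):.5f}")
```

As with evolution and human drives, the selection pressure (the loss) and the resulting behavior (the learned parameter) are separate things, which is one reason the analogy is suggestive but contested.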