|Main Question: Is humanity doomed?|
Not necessarily. If a superintelligence has goals that include humanity's survival, we would not be destroyed. If those goals are fully aligned with human well-being, we could in fact find ourselves in a dramatically better position.
We don't yet have AI systems that are generally more capable than humans, so there is still time to figure out how to build smarter-than-human systems safely.
Unanswered non-canonical questions
If AGI is imminent, is there anything that can actually be done before it arrives? Even if AGI might destroy the world, is working on it the best expenditure of energy, given that other existential risks may be less severe but more actionable? One option might be asking the leading researchers to stop their research, but would that actually work under capitalism, given how easy it seems for newcomers to revolutionize the field?