Chiron's question on Killer Robot Arms Race
Corollary question: how do you actually restrict AI research? With nukes you need large, sophisticated facilities to refine the raw materials, but AI can be developed by anyone with enough computing power, and sufficient computing power will only become more accessible as time goes on.
Computing power is not necessarily the only bottleneck before we reach AGI; it seems likely that a significant amount of research time will be needed to actually engineer a sufficiently powerful system. (If not, this question becomes much harder; compare the "Yudkowsky-Moore law": "Every 18 months, the minimum IQ necessary to destroy the world drops by one point.")

If we could convince everyone in the AI field that alignment should be the top priority, restrictions on research could be enforced through funding (this is a big IF at the moment, of course). There is some precedent for this: it is widely agreed that genetic engineering of humans should not be pursued, and it is therefore nearly impossible to get grants for research in that area. Some lone researchers have done work in the area anyway, but without access to funding it is very difficult for them to do anything with far-reaching consequences.
Origin

|Where was this question originally asked|YouTube (comment link)|
|On video:|The other "Killer Robot Arms Race" Elon Musk should worry about|
|Asked on Discord?|Yes|