A value handshake is a form of trade between superintelligences. When two AIs with incompatible utility functions meet, instead of going to war, they can agree on a settlement: since they have superhuman prediction abilities and likely know the outcome before any attack happens, they can split the universe into regions whose volumes reflect their respective military strength or chance of victory. If their utility functions are compatible enough, they might even merge into a single AI whose utility function is a weighted average of the two previous ones.
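The weighted-average merger can be sketched concretely. The following is an illustrative toy model, not anything from the source: the function names, the scalar "outcome" space, and the choice to weight by estimated probability of victory are all assumptions made for the example.

```python
def merge_utilities(u_a, u_b, p_a):
    """Return a utility function that is the p_a-weighted average of
    u_a and u_b, where p_a is agent A's estimated chance of victory."""
    assert 0.0 <= p_a <= 1.0
    def merged(outcome):
        return p_a * u_a(outcome) + (1.0 - p_a) * u_b(outcome)
    return merged

# Hypothetical utility functions over a scalar outcome: the fraction
# of the universe devoted to agent A's goal.
u_a = lambda x: x          # agent A wants the fraction maximized
u_b = lambda x: 1.0 - x    # agent B wants the opposite

# If A is predicted to win with probability 0.7, the merged agent
# weights A's preferences at 0.7 and B's at 0.3.
u_merged = merge_utilities(u_a, u_b, 0.7)
print(u_merged(1.0))  # 0.7*1.0 + 0.3*0.0 = 0.7
```

Under this toy model, the merged agent prefers outcomes favoring the stronger party in proportion to its bargaining position, which is the intuition behind splitting by chance of victory.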
This could happen if multiple AIs are active on Earth at the same time. If at least one of them is aligned with humans, the resulting value handshake could leave humanity in a reasonably good situation.
See "The Hour I First Believed" by Scott Alexander for further thoughts and an introduction to related topics.