The Ape Machine's question on Instrumental Convergence

From Stampy's Wiki
Question id: Ugxe0dZ-3oHRoO88oPB4AaABAg


You know just as well as I do that the guy who collects stamps will not just buy some stamps; he will build The Stamp Collector, and you have just facilitated the end of all humanity :( On a more serious note, I would like to ask: do you have any insights into how this relates to the way humans often feel a sense of emptiness after achieving all of their goals? I may be failing to explain it correctly, but there is this idea that humans always need a new goal to feel happy, right? Maybe I am completely off, but what I am asking is: yes, an intelligent agent can have simple or even really complex goals, but will it ever be able to mimic the way goals are present in humans, a goal that is not so much supposed to be achieved as to serve as fuel for making progress, something kind of like: a desire?

Question Info
Asked by: The Ape Machine
Origin: YouTube (comment link)
On video: Why Would AI Want to do Bad Things? Instrumental Convergence
Date: 2018-03-25T13:42
Asked on Discord? No
YouTube Likes: 4
Reply count: 2


Discussion