AppliedMathematician's question on The Orthogonality Thesis
Well, the claim that you can never derive an ought from an is, is not necessarily true. You might be able to define a self-aware structure as an is-statement, and its understanding of the word "ought" as well. If X is a sentient being and Y is its interpretation function for all ought-statements, you can apply the interpretation function, given as an is-statement, to an is-state-description of the world, which will result in what X ought to do. Now you would have to argue that the ought-interpretation function must contain fundamental ought-rules; that is, that it must contain fundamentals for applying a kind of calculus of ought-statements. But this does not seem to be true: it could be just a neural network that at most has "ought" semantics as emergent properties ... Its "ought" is just the "feeling" to do something given a certain input. It may even generate a number of different "feeling" signals, each associated with some strength level, and have an indeterminate priority-selection semantics. In any case, there would be an is-description of the neural network ...
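The picture above — an interpretation function that maps an is-description of the world to weighted "feeling" signals, with an indeterminate priority selection among them — can be sketched as a toy model. Everything here is an invented illustration (the function names, the feature keys, and the weights are placeholders, not a real model of any agent):

```python
import random

def ought_interpretation(world_state):
    """Hypothetical 'feeling' generator: maps an is-description of the
    world (a dict of features) to weighted action impulses.
    The features and weights are invented placeholders."""
    return {
        "help": world_state.get("someone_in_need", 0) * 0.9,
        "rest": world_state.get("fatigue", 0) * 0.6,
        "explore": world_state.get("novelty", 0) * 0.4,
    }

def select_action(impulses, rng=random):
    """Indeterminate priority selection: sample an action with
    probability proportional to its impulse strength."""
    total = sum(impulses.values())
    if total == 0:
        return None
    r = rng.uniform(0, total)
    cumulative = 0.0
    for action, strength in impulses.items():
        cumulative += strength
        if r <= cumulative:
            return action
    return action  # fallback for floating-point edge cases

state = {"someone_in_need": 1.0, "fatigue": 0.5, "novelty": 0.2}
feelings = ought_interpretation(state)
print(select_action(feelings))  # one of "help", "rest", "explore"
```

The point of the sketch is only that both functions are ordinary is-descriptions (code and weights), yet their output behaves like an "ought" — no explicit ought-calculus appears anywhere in them.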
Emergentism can be a pain for this and associated issues. For example, there is no strict naturalistic basis for a concept like the soul: in a scientific sense, souls do not exist. That is at least a common claim, but is it true? Maybe it is simply not known how a soul could be an emergent structure of things that do exist, like brains.
Origin: Where was this question originally asked?
|YouTube (comment link)|
|On video:||Intelligence and Stupidity: The Orthogonality Thesis|
|Asked on Discord?||Yes|