What is reward shaping?
Reward shaping, in general, is a technique that adds extra reward signals when a task is being solved correctly, making learning easier. By rewarding useful behaviors and penalizing harmful ones, it reinforces progress toward the goal. In a game, for example, the ultimate aim is to win, not lose, but shaping can also reward the intermediate steps that lead toward a win.
In machine learning, reward shaping is typically associated with reinforcement learning: a machine learning method in which the machine receives rewards based on its actions and is adjusted to be more likely to take actions that lead to high reward.
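As a rough illustration of that loop, here is a minimal sketch of a tabular Q-learning agent; the tiny grid-world environment, reward values, and hyperparameters below are all hypothetical and chosen only to show how estimates shift toward actions that earned high reward:

```python
import random

# Hypothetical 1-D grid world: start at position 0, reach position 4 to win.
GOAL, ACTIONS = 4, [-1, +1]

def step(state, action):
    """Apply an action and return (next_state, reward)."""
    next_state = max(0, min(GOAL, state + action))
    reward = 1.0 if next_state == GOAL else 0.0  # reward only on success
    return next_state, reward

# Q-table: estimated value of each (state, action) pair.
q = {(s, a): 0.0 for s in range(GOAL + 1) for a in ACTIONS}
alpha, gamma, epsilon = 0.1, 0.9, 0.2

for episode in range(500):
    state = 0
    while state != GOAL:
        # Explore occasionally; otherwise pick the best-known action.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state, reward = step(state, action)
        # Move the estimate toward reward + discounted future value,
        # so high-reward actions become more likely to be chosen.
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state
```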
This reward-based approach does have some weaknesses: it can require many interactions with the environment, which may be expensive, and sparse rewards can slow down learning. Nonetheless, the underlying goal of the agent is to maximize its rewards, and reward shaping helps steer it toward an optimal solution.
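To see how shaping helps with sparse rewards, here is a sketch of one common variant, potential-based reward shaping; the potential function below, which simply rewards getting closer to the goal, is an assumption chosen for illustration. The shaped reward adds gamma * phi(s') - phi(s) to the environment reward, giving the agent feedback on every step rather than only at the goal:

```python
GAMMA = 0.9

def potential(state, goal=4):
    """Hypothetical potential: higher (less negative) when closer to the goal."""
    return -abs(goal - state)

def shaped_reward(reward, state, next_state):
    """Potential-based shaping: r' = r + gamma * phi(s') - phi(s).

    This provides a dense learning signal on every transition,
    not just at the goal, while leaving the optimal policy unchanged.
    """
    return reward + GAMMA * potential(next_state) - potential(state)

# Example: moving from state 2 to state 3, toward the goal at 4.
# The environment reward is 0 (sparse), but the shaped reward is positive.
print(shaped_reward(0.0, 2, 3))  # 0 + 0.9*(-1) - (-2) = 1.1
```

In the Q-learning sketch above, this would slot in by passing the shaped reward into the update in place of the raw environment reward.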