What is reward shaping?
Reward shaping, in general, can be defined as a method that grants an agent extra rewards for behaviors that move it toward solving a task correctly, which makes learning easier. Because individual actions receive clearer positive or negative feedback, appropriate behavior is reinforced along the way rather than only at the end. In a game, for example, the ultimate aim is to win rather than lose, but shaped rewards can also reinforce the intermediate steps that lead to a win.
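As a rough illustration, here is a minimal sketch of that idea in Python: a tiny gridworld in which a sparse reward only signals the final win, while a shaped reward also grants a small bonus for progress toward the goal. The names (`sparse_reward`, `shaped_reward`, the 5x5 grid) are made up for this example rather than taken from any particular library.

```python
# Minimal sketch of reward shaping in a hypothetical 5x5 gridworld.
# The shaped reward adds a small bonus for moving closer to the goal,
# so the agent gets feedback on intermediate steps, not just the win.

GOAL = (4, 4)  # goal cell of the hypothetical 5x5 grid

def sparse_reward(state):
    """Only the winning state is rewarded; every other step gives no signal."""
    return 1.0 if state == GOAL else 0.0

def manhattan_distance(state, goal=GOAL):
    return abs(state[0] - goal[0]) + abs(state[1] - goal[1])

def shaped_reward(prev_state, state):
    """Sparse reward plus a small bonus for getting closer to the goal."""
    progress = manhattan_distance(prev_state) - manhattan_distance(state)
    return sparse_reward(state) + 0.1 * progress

# A step from (2, 2) to (2, 3) earns a small positive reward under shaping,
# even though the game is not yet won.
print(sparse_reward((2, 3)))          # 0.0 -> no feedback yet
print(shaped_reward((2, 2), (2, 3)))  # 0.1 -> progress is reinforced
```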
In machine learning (an approach to AI in which, instead of designing an algorithm directly, we have the system search through possible algorithms based on how well they do on some training data), reward shaping is applied to an agent: a system that can be understood as taking actions towards achieving a goal.
This reward-based approach does have some weaknesses: it can require many interactions with the environment, which may be expensive, and sparse rewards can slow learning down. Nevertheless, the underlying goal of the agent is to maximize its rewards, and reward shaping helps it work towards an optimal solution to the problem.
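To show where the shaped reward enters the picture when an agent tries to maximize its rewards, here is a minimal sketch, assuming the same hypothetical 5x5 gridworld and a basic tabular Q-learning agent; all names and parameter values are illustrative only, not a tuned or definitive implementation.

```python
# Sketch: an agent maximizing shaped rewards with tabular Q-learning
# in an assumed 5x5 gridworld. The shaped reward gives feedback on every
# step, rather than only when the goal is finally reached.

import random
from collections import defaultdict

GOAL = (4, 4)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # the four grid moves

def step(state, action):
    """Apply a move, clipping to the 5x5 grid."""
    x = min(max(state[0] + action[0], 0), 4)
    y = min(max(state[1] + action[1], 0), 4)
    return (x, y)

def shaped_reward(prev_state, state):
    """+1 for reaching the goal, plus a small bonus for getting closer to it."""
    def dist(s):
        return abs(s[0] - GOAL[0]) + abs(s[1] - GOAL[1])
    return (1.0 if state == GOAL else 0.0) + 0.1 * (dist(prev_state) - dist(state))

q = defaultdict(float)            # Q-values, indexed by (state, action)
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(200):
    state = (0, 0)
    while state != GOAL:
        # Epsilon-greedy: mostly pick the action with the highest Q-value.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state = step(state, action)
        r = shaped_reward(state, next_state)
        # Q-learning update: nudge Q toward reward + discounted future value.
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (r + gamma * best_next - q[(state, action)])
        state = next_state
```

Because the shaped reward gives a nonzero signal on almost every step, the agent receives far denser feedback than it would from the win/lose outcome alone, which is the motivation for shaping in settings where rewards are otherwise sparse.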