Goodhart's Law is essential for managers to understand. At its core, it can be summed up colloquially as:
Give someone a score, and they will try to make it go up.
The more formal version:
Once a measure becomes a target, it ceases to be a reliable measure.
The thesis is that a measure is likely to be gamed once it becomes a strategic target, especially when consequences are tied to changes in that measure.
Imagine that you measure "number of sales calls completed" as a proxy for your sales team's overall effort, and you announce a target you want that number to hit. Your associates are now incentivized to game the measure by simply rushing their calls. The measure is no longer useful, and worse, it produces unintended negative consequences.
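The dynamic in the sales-call example can be sketched with a toy simulation. All the numbers here (selling hours, call lengths, conversion rates) are hypothetical assumptions chosen only to illustrate how the measure and the underlying outcome can move in opposite directions:

```python
# Toy model of Goodhart's Law in the sales-call example.
# All parameters below are illustrative assumptions, not real data.

HOURS_PER_DAY = 6  # assumed selling time per associate per day

def day_results(minutes_per_call: float, conversion_rate: float):
    """Calls completed and expected sales for one associate-day."""
    calls = int(HOURS_PER_DAY * 60 / minutes_per_call)
    return calls, calls * conversion_rate

# Before the target: unhurried 30-minute calls, assumed 10% conversion.
before_calls, before_sales = day_results(30, 0.10)

# After "calls completed" becomes the target: rushed 10-minute calls
# triple the measure, but conversion drops (assumed 2%).
after_calls, after_sales = day_results(10, 0.02)

print(f"before target: {before_calls} calls, {before_sales:.2f} expected sales")
print(f"after target:  {after_calls} calls, {after_sales:.2f} expected sales")
```

Under these assumptions the tracked measure triples while the outcome it was supposed to proxy, actual sales, falls. That gap between the proxy and the goal is exactly what Goodhart's Law warns about.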