Why models often benefit from reducing the learning rate during training


In the official Keras documentation for the ReduceLROnPlateau class (https://keras.io/api/callbacks/reduce_lr_on_plateau/), it says:

"Models often benefit from reducing the learning rate"

Why is that so? It's counter-intuitive, to me at least, since as far as I know a higher learning rate allows taking larger steps from my current position.

Thanks!


Best Answer

A learning rate that is either too high or too low is problematic for training a neural network. A large learning rate can overshoot the global minimum and, in extreme cases, cause the model to diverge from the optimal solution entirely. On the other hand, a small learning rate can cause the model to get stuck in a local minimum.
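To make the overshooting effect concrete, here is a toy gradient-descent run on f(x) = x² (whose gradient is 2x). The function, starting point, and learning rates are illustrative choices, not something from the Keras docs:

```python
# Toy gradient descent on f(x) = x^2, whose minimum is at x = 0.
# The update is x <- x - lr * f'(x), with f'(x) = 2x.

def gradient_descent(lr, steps=50, x0=5.0):
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x^2 is 2x
    return x

too_large = gradient_descent(lr=1.1)    # each step overshoots 0 and |x| grows: divergence
reasonable = gradient_descent(lr=0.1)   # x shrinks geometrically toward the minimum
too_small = gradient_descent(lr=1e-4)   # converging, but barely moved after 50 steps

print(too_large, reasonable, too_small)
```

With lr = 1.1 the update multiplies x by (1 − 2·1.1) = −1.2 each step, so the iterate jumps across the minimum and grows without bound, while lr = 0.1 shrinks it by a factor of 0.8 per step.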

ReduceLROnPlateau's purpose is to track your model's performance and reduce the learning rate when there has been no improvement for a set number of epochs. The intuition is that the model has gotten as close as it can to a solution with the current learning rate and is oscillating around the minimum. Reducing the learning rate lets the model take smaller steps toward the optimum of the cost function.
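The core logic can be sketched in a few lines of plain Python. This is a simplified imitation of what the callback does (it omits details like min_delta, cooldown, and mode handling, so treat it as an illustration rather than the actual Keras implementation):

```python
# Simplified sketch of ReduceLROnPlateau's logic: replay a sequence of
# per-epoch losses; if the best loss has not improved for `patience`
# epochs, multiply the learning rate by `factor` (never below `min_lr`).

def reduce_lr_on_plateau(losses, lr=0.1, factor=0.1, patience=3, min_lr=1e-6):
    """Return the learning rate after replaying the epoch losses."""
    best = float("inf")
    wait = 0  # epochs since the last improvement
    for loss in losses:
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                lr = max(lr * factor, min_lr)
                wait = 0
    return lr

# Loss improves for four epochs, then plateaus: one reduction by `factor`.
print(reduce_lr_on_plateau([1.0, 0.8, 0.7, 0.65, 0.65, 0.65, 0.65, 0.65]))
```

In actual Keras code you would not write this yourself; you would pass the real callback to `model.fit`, e.g. `ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=10)` as shown in the documentation page linked in the question.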
