I added the following lines to my deep learning project to stop training early when the validation loss has not improved for 10 epochs:
if best_valid_loss is None or valid_loss < best_valid_loss:
    best_valid_loss = valid_loss
    counter = 0
else:
    counter += 1
    if counter == 10:
        break
Now I want to use Optuna to tune some hyperparameters, but I don't really understand how pruning works in Optuna. Is it possible for Optuna pruners to act the same way as in the code above? I assume I have to use the following:
optuna.pruners.PatientPruner(???, patience=10)
But I don't know which pruner I could use inside PatientPruner. By the way, in Optuna I'm minimizing the validation loss.
Short answer: Yes.
Hi, I'm one of the authors of PatientPruner in Optuna. If we want to perform vanilla early stopping, wrapped_pruner=None works as we expect: the trial is pruned once the reported intermediate value stops improving for more than patience steps. For example, with a dummy objective whose reported loss stops improving after a few steps, the output will be pruned at 15.