Changing an optimiser's momentum in PyTorch

I know you can change the learning rate dynamically in PyTorch using schedulers. How can you do the same with momentum?

1 Answer

This pattern comes from the PyTorch documentation for MultiplicativeLR. Basically, you wrap a scheduler around the optimizer the same way you wrap the optimizer around the model's parameters:

import torch
from torch.optim.lr_scheduler import MultiplicativeLR

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)  # `model` is your network

lmbda = lambda epoch: 0.95  # multiply the learning rate by 0.95 each epoch
scheduler = MultiplicativeLR(optimizer, lr_lambda=lmbda)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()
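
Note that MultiplicativeLR only adjusts the learning rate; there is no built-in multiplicative scheduler for momentum. Since each entry in optimizer.param_groups is a plain dict, though, you can rewrite its 'momentum' key yourself each epoch. A minimal sketch (the set_momentum helper is hypothetical, assuming the same SGD optimizer as above):

def set_momentum(optimizer, value):
    # Hypothetical helper: each param group is a dict, so 'momentum' can be set directly.
    for group in optimizer.param_groups:
        group['momentum'] = value

for epoch in range(100):
    train(...)
    validate(...)
    # Decay momentum multiplicatively, mirroring lr_lambda above.
    set_momentum(optimizer, 0.95 * optimizer.param_groups[0]['momentum'])

Alternatively, schedulers such as CyclicLR and OneCycleLR already vary momentum alongside the learning rate via their cycle_momentum, base_momentum, and max_momentum arguments.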