Adam optimizer reports an error in Chainer?


Version: Chainer 2.0.2. I use the Adam optimizer and it reports an error. I found that it is caused by this code in adam.py (fix1 == 0?):

@property
def lr(self):
    fix1 = 1. - math.pow(self.hyperparam.beta1, self.t)
    fix2 = 1. - math.pow(self.hyperparam.beta2, self.t)
    return self.hyperparam.alpha * math.sqrt(fix2) / fix1

error log:

Traceback (most recent call last):
  File "AU_rcnn/train.py", line 237, in <module>
    main()
  File "AU_rcnn/train.py", line 233, in main
    trainer.run()
  File "/root/anaconda3/lib/python3.6/site-packages/chainer/training/trainer.py", line 285, in run
    initializer(self)
  File "/root/anaconda3/lib/python3.6/site-packages/chainer/training/extensions/exponential_shift.py", line 48, in initialize
    self._init = getattr(optimizer, self._attr)
  File "/root/anaconda3/lib/python3.6/site-packages/chainer/optimizers/adam.py", line 121, in lr
    return self.hyperparam.alpha * math.sqrt(fix2) / fix1
ZeroDivisionError: float division by zero
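
To check the suspicion about fix1, here is a minimal reproduction of the division, assuming self.t is still 0 when the trainer extensions are initialized (which is what the traceback suggests):

import math

beta1 = 0.9  # Chainer's default beta1 for Adam
t = 0        # no update has run yet when ExponentialShift reads the lr property
fix1 = 1. - math.pow(beta1, t)  # math.pow(0.9, 0) == 1.0, so fix1 == 0.0
print(fix1)                     # 0.0 -> the division in the lr property raises ZeroDivisionError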

There are 2 answers below

BEST ANSWER

Use "alpha" attribute to control learning rate for Adam in Chainer. "lr" is defined as built-in property, it should not be override by other value.

Set "alpha" as an attribute for ExponentialShift (official doc) as well to decay learning rate, if you use Adam optimizer.

from chainer.optimizers import Adam
from chainer.training import extensions

optimizer = Adam(alpha=0.001)

# --- Define trainer here... ---

# Decay Adam's "alpha" (not "lr") by a factor of 0.99 at the end of every epoch.
trainer.extend(extensions.ExponentialShift("alpha", 0.99, optimizer=optimizer),
               trigger=(1, 'epoch'))
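
With this setup, ExponentialShift rescales alpha at every epoch boundary. Since Adam derives its effective step size from alpha (the lr property shown in the question is just computed from it), this decays the learning rate without ExponentialShift ever reading lr before the first update, which is where the division by zero occurs.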

I have the same problem and tried corochann's approach. However, it did not solve the problem.


My Chainer version is 2.1.0. The code I used is https://github.com/chainer/chainer/blob/master/examples/cifar/train_cifar.py, with L57 changed to "optimizer = chainer.optimizers.Adam()".
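
For what it is worth, a minimal sketch of the change described above (the surrounding lines are assumptions, not a verbatim copy of train_cifar.py). If the example also registers a learning-rate decay extension, it would have to shift "alpha" rather than "lr" when Adam is used, as in the accepted answer:

import chainer
from chainer.training import extensions

# Replacement for the optimizer line referred to above.
optimizer = chainer.optimizers.Adam()
optimizer.setup(model)  # `model` is defined earlier in the example script

# Assumption: any ExponentialShift extension must target "alpha", not "lr",
# when Adam is used (the rate and trigger below are illustrative values).
trainer.extend(extensions.ExponentialShift('alpha', 0.5, optimizer=optimizer),
               trigger=(25, 'epoch'))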