Recently, I’ve been working on machine learning with TensorFlow 2. My code requires the tfa.optimizers.extend_with_decoupled_weight_decay()
function from the TensorFlow Addons (TFA) package.
However, during execution the code raises an exception for which I cannot find a resolution or explanation anywhere online.
Any help clarifying the problem would be much appreciated!
Here are the specifications of the platform and versions I am using:
OS: Windows 11
Python: 3.10.9
Tensorflow: 2.13.0
Tensorflow Addons: 0.21.0
The following example code, taken from TensorFlow's documentation, raises the same exception as the code in my project:
import tensorflow as tf
import tensorflow_addons as tfa
# MyAdamW is a new class
MyAdamW = tfa.optimizers.extend_with_decoupled_weight_decay(tf.keras.optimizers.Adam)
# Create a MyAdamW object
optimizer = MyAdamW(weight_decay=0.001, learning_rate=0.001)
# Update var1 and var2, but apply weight decay only to var1
# (loss, var1 and var2 are defined elsewhere)
optimizer.minimize(loss, var_list=[var1, var2], decay_var_list=[var1])
The error message:
Traceback (most recent call last):
File "test.py", line 8, in <module>
optimizer = MyAdamW(weight_decay=0.001, learning_rate=0.001)
File "C:AppData\Local\Programs\Python\Python310\lib\site-packages\typeguard\__init__.py", line 1033, in wrapper
retval = func(*args, **kwargs)
File "C:AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\optimizers\weight_decay_optimizers.py", line 369, in __init__
super().__init__(weight_decay, *args, **kwargs)
File "C:AppData\Local\Programs\Python\Python310\lib\site-packages\typeguard\__init__.py", line 1033, in wrapper
retval = func(*args, **kwargs)
File "C:AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\optimizers\weight_decay_optimizers.py", line 99, in __init__
self._set_hyper("weight_decay", wd)
AttributeError: 'OptimizerWithDecoupledWeightDecay' object has no attribute '_set_hyper'
Suspecting an incompatibility with the Addons package, I tried downgrading it to 0.20.0 and then to 0.19.0, but the same exception appeared in both cases.
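If I am reading the traceback correctly, the class that extend_with_decoupled_weight_decay() builds calls self._set_hyper() in its __init__, and (as far as I can tell) that method exists only on the legacy Keras optimizer base class, not on the reworked optimizers that tf.keras.optimizers.Adam resolves to in TF 2.13. A toy, TensorFlow-free sketch of that failure pattern, where every class name is a hypothetical stand-in rather than the real TFA/Keras class:

```python
# Toy sketch of my suspicion: a mixin whose __init__ calls a method
# (_set_hyper) that only the *legacy* optimizer base class defines.

class LegacyOptimizer:
    """Stand-in for the old Keras optimizer base, which has _set_hyper."""
    def __init__(self, **kwargs):
        self._hyper = {}

    def _set_hyper(self, name, value):
        self._hyper[name] = value


class NewOptimizer:
    """Stand-in for the reworked optimizer base: no _set_hyper at all."""
    def __init__(self, **kwargs):
        pass


class WeightDecayMixin:
    """Stand-in for the decoupled-weight-decay extension."""
    def __init__(self, weight_decay, **kwargs):
        super().__init__(**kwargs)
        # Fails with AttributeError when the base lacks _set_hyper:
        self._set_hyper("weight_decay", weight_decay)


def extend(base):
    # Mirrors how extend_with_decoupled_weight_decay builds a new
    # class dynamically from the mixin and the given optimizer base.
    return type("OptimizerWithDecoupledWeightDecay", (WeightDecayMixin, base), {})


extend(LegacyOptimizer)(weight_decay=0.001)  # constructs fine

try:
    extend(NewOptimizer)(weight_decay=0.001)  # AttributeError, like my traceback
except AttributeError as exc:
    print(exc)
```

If that guess is right, the real question becomes whether TFA 0.21.0 is supposed to work with TF 2.13's default optimizers at all, or whether something like the legacy optimizer classes is required.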