I am trying to get the weights of a layer. It seems to work properly when a Keras layer is used directly and an input is connected to it. However, when I wrap that layer in my custom layer, it no longer works. Is that a bug, or what am I missing?
Edit: considerations:
I read that one can define the trainable variables of a custom layer in its build() method. However, since my custom layer consists of the Keras layer Dense (and potentially more Keras layers later), those layers should already have their trainable variables and weight/bias initializers defined. (I do not see a way to overwrite them, in __init__() of TestLayer, with variables that would be defined in build() of TestLayer.)
class TestLayer(layers.Layer):
    def __init__(self):
        super(TestLayer, self).__init__()
        self.test_nn = layers.Dense(3)

    def build(self, input_shape):
        super(TestLayer, self).build(input_shape)

    def call(self, inputs, **kwargs):
        test_out = self.test_nn(inputs)  # inputs here is test_in
        return test_out
test_in = layers.Input((2,))
test_nn = layers.Dense(3)
print(test_nn.get_weights()) # empty, since no connection to the layer
test_out = test_nn(test_in)
print(test_nn.get_weights()) # layer returns weights+biases
testLayer = TestLayer()
features = testLayer(test_in)
print(testLayer.get_weights()) # Problem: still empty, even though connected to input.
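The empty result before a layer is connected is expected: Keras layers create their variables lazily, on first use with a concrete input shape. A minimal numpy-only sketch of that build-on-first-call pattern (LazyDense is an illustrative stand-in, not Keras internals):

```python
import numpy as np

class LazyDense:
    """Toy stand-in for a Keras Dense layer: weights are created
    only when the layer first sees an input (build-on-call)."""
    def __init__(self, units):
        self.units = units
        self.built = False
        self.kernel = None
        self.bias = None

    def build(self, input_dim):
        # Variables exist only after build() has run.
        self.kernel = np.random.randn(input_dim, self.units) * 0.05
        self.bias = np.zeros(self.units)
        self.built = True

    def __call__(self, x):
        if not self.built:
            self.build(x.shape[-1])
        return x @ self.kernel + self.bias

    def get_weights(self):
        # Mirrors Keras behavior: empty list until the layer is built.
        return [] if not self.built else [self.kernel, self.bias]

layer = LazyDense(3)
print(layer.get_weights())                      # [] before the first call
out = layer(np.zeros((1, 2)))
print([w.shape for w in layer.get_weights()])   # [(2, 3), (3,)]
```

This is why the bare test_nn above only reports weights after test_nn(test_in) has run; the custom-layer case fails for a different reason, addressed in the answer below.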
The documentation says that the build() method should have calls to add_weight(), which yours does not. You also don't need to define a Dense layer inside of your class if you are subclassing layers.Layer; here are some more examples of subclassing the Layer class. However, if you insist on implementing it your way and you want to use get_weights(), you have to override it (in this case you can just create a class without subclassing):
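The answer's snippet is not reproduced above; a minimal sketch of such an override, using a numpy stand-in for layers.Dense so it runs without TensorFlow (ToyDense and DenseWrapper are illustrative names; in real code self.inner would be the wrapped Keras Dense layer):

```python
import numpy as np

class ToyDense:
    """Numpy stand-in for an already-built layers.Dense(3)."""
    def __init__(self, units, input_dim):
        self.kernel = np.ones((input_dim, units))
        self.bias = np.zeros(units)

    def __call__(self, x):
        return x @ self.kernel + self.bias

    def get_weights(self):
        return [self.kernel, self.bias]

class DenseWrapper:
    """Plain class (no Layer subclass) wrapping an inner layer."""
    def __init__(self, inner):
        self.inner = inner

    def __call__(self, x):
        return self.inner(x)

    def get_weights(self):
        # The override: delegate to the wrapped layer instead of
        # returning this object's own (empty) variable list.
        return self.inner.get_weights()

wrapper = DenseWrapper(ToyDense(3, 2))
print([w.shape for w in wrapper.get_weights()])  # [(2, 3), (3,)]
```

The design point is simply delegation: the wrapper has no variables of its own, so get_weights() must forward to the layer that actually owns them.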