The ReLU function as defined in keras/activations.py is:
def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)
It has a max_value argument which can be used to clip the value. Now, how can this be used/called in the code? I have tried the following:
(a)
model.add(Dense(512, input_dim=1))
model.add(Activation('relu', max_value=250))
assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
AssertionError: Keyword argument not understood: max_value
(b)
Rel = Activation('relu', max_value=250)
This gives the same error.
(c)
from keras.layers import activations
uu = activations.relu(??,max_value=250)
The problem with this is that it expects the input as its first argument. The error is 'relu() takes at least 1 argument (1 given)'.
So how do I make this a layer?
model.add(activations.relu(max_value=250))
has the same issue: 'relu() takes at least 1 argument (1 given)'.
If this file cannot be used as a layer, then there seems to be no way of specifying a clip value for ReLU. This implies that the comment here https://github.com/fchollet/keras/issues/2119 closing a proposed change is wrong... Any thoughts? Thanks!
You can use the ReLU function of the Keras backend. To do so, first import the backend:
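from keras import backend as K  # conventional alias, matching the K.relu call shown in the question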
Then you can pass your own function as the activation, using the backend functionality. This would look like the following
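(a minimal sketch; the helper name relu_advanced is just an illustrative choice, and the clip value 250 is taken from the question):
def relu_advanced(x):
    # ReLU clipped at 250 via the backend's max_value argument
    return K.relu(x, max_value=250)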
Then you can use it like
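model.add(Dense(512, input_dim=1, activation=relu_advanced))  # same Dense layer as in the question, activation passed as a callable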
or
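model.add(Activation(relu_advanced))  # or as an explicit Activation layer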
Unfortunately, you must hard-code additional arguments this way. It is therefore better to use a function that returns your function with your custom values baked in:
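(again a sketch; the factory name create_relu_advanced is just an illustrative choice):
def create_relu_advanced(max_value=1.):
    # factory: returns a ReLU clipped at the given max_value
    def relu_advanced(x):
        return K.relu(x, max_value=max_value)
    return relu_advanced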
Then you can pass your arguments by either
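model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))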
or
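model.add(Activation(create_relu_advanced(max_value=250)))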