When building a Sequential model, I noticed there is a difference between adding a relu layer and adding a LeakyReLU layer:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

test = Sequential()
test.add(Dense(1024, activation="relu"))
test.add(LeakyReLU(0.2))
- Why can't we add the layer with activation="LeakyReLU"? (LeakyReLU is not a string that Keras can work with.)
- When adding the relu layer, we set the number of units (1024 in my example). Why can't we do the same for LeakyReLU?
I was sure that the only difference between relu and LeakyReLU was the behaviour of the function itself, but it seems to be more than that.
You can specify the activation function in the Dense layer itself by using a string alias such as activation='relu', which uses the default Keras parameters for relu. There is no such string alias in Keras for the LeakyReLU activation function; you have to use tf.keras.layers.LeakyReLU or tf.nn.leaky_relu instead. You also cannot set the number of units in an activation layer: it simply takes the output tensor of the previous layer and applies the activation function to it. The 1024 units you specified belong to the Dense layer, not to the relu layer.
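To make this concrete, here is a minimal sketch of the two options just described (assuming TensorFlow 2.x; the 128-feature input shape is only an illustrative assumption, not something from your model):

import tensorflow as tf
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

# Option 1: a plain Dense layer followed by a separate LeakyReLU layer.
# The 1024 units belong to Dense; LeakyReLU only transforms its output.
model_a = Sequential([
    Input(shape=(128,)),   # hypothetical input shape, for illustration
    Dense(1024),
    LeakyReLU(0.2),        # negative slope = 0.2
])

# Option 2: pass the callable tf.nn.leaky_relu as the activation argument.
# This uses the function's default slope, since there is no string alias
# that would let you set a custom one inline.
model_b = Sequential([
    Input(shape=(128,)),
    Dense(1024, activation=tf.nn.leaky_relu),
])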
When we specify Dense(1024, activation="relu"), the layer multiplies the inputs by the weights, adds the biases and applies the relu function to the result, all on a single line. With the approach shown in your snippet, the same process happens in two stages: the Dense layer first does the weight multiplication and bias addition, and then the separate LeakyReLU layer applies the activation, which is why it takes two lines.
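A quick sketch of that equivalence (again with a hypothetical 128-feature input): both models below compute relu(x @ W + b), just written in one stage or in two.

from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Dense, ReLU

# Single line: matmul + bias + relu fused into one layer definition.
one_stage = Sequential([
    Input(shape=(128,)),
    Dense(1024, activation="relu"),
])

# Two lines: Dense does matmul + bias, the ReLU layer then applies the activation.
two_stage = Sequential([
    Input(shape=(128,)),
    Dense(1024),
    ReLU(),
])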