How to use leaky ReLUs as the activation function in hidden layers in pylearn2


I am using the pylearn2 library to design a CNN. I want to use leaky ReLUs as the activation function in one layer. Is there any way to do this with pylearn2? Do I have to write a custom function for it, or does pylearn2 have built-in functions for that? If I need a custom function, how should I write it? Can anyone help me out here?


BEST ANSWER

The ConvElemwise super-class is a generic convolutional elementwise layer. Among its subclasses, ConvRectifiedLinear is a convolutional rectified linear layer that uses the RectifierConvNonlinearity class.

In RectifierConvNonlinearity's apply() method, the left_slope attribute is what produces the leak:

    p = linear_response * (linear_response > 0.) + self.left_slope *\
        linear_response * (linear_response < 0.)
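
So passing a RectifierConvNonlinearity with a non-zero left_slope to a ConvElemwise layer gives you a leaky ReLU. A minimal sketch (the hyperparameter values below are placeholders, not anything from your setup):

    # Sketch only: build a convolutional layer with a leaky ReLU by giving
    # ConvElemwise a RectifierConvNonlinearity whose left_slope is non-zero.
    # All hyperparameter values are placeholders.
    from pylearn2.models.mlp import ConvElemwise, RectifierConvNonlinearity

    leaky_relu = RectifierConvNonlinearity(left_slope=0.01)

    conv_layer = ConvElemwise(
        layer_name='conv_leaky',
        output_channels=64,
        kernel_shape=[5, 5],
        nonlinearity=leaky_relu,
        irange=0.05,
        pool_type='max',
        pool_shape=[2, 2],
        pool_stride=[2, 2],
    )

In a YAML config the equivalent would be to pass something like !obj:pylearn2.models.mlp.RectifierConvNonlinearity { left_slope: 0.01 } as the nonlinearity of the ConvElemwise layer.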

As this gentle review points out:

... the Maxout neuron (introduced recently by Goodfellow et al.) that generalizes the ReLU and its leaky version.

Examples in pylearn2 are MaxoutLocalC01B and MaxoutConvC01B.
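
For completeness, a rough sketch of instantiating a maxout convolutional layer (argument values are placeholders; note that MaxoutConvC01B relies on the cuda-convnet wrappers and C01B-ordered input, so it needs a GPU backend):

    # Sketch only: a maxout convolutional layer, which generalizes the ReLU
    # and its leaky variant. Values are placeholders.
    from pylearn2.models.maxout import MaxoutConvC01B

    maxout_layer = MaxoutConvC01B(
        layer_name='maxout_conv',
        num_channels=48,
        num_pieces=2,          # number of linear pieces the max is taken over
        kernel_shape=[5, 5],
        pool_shape=[2, 2],
        pool_stride=[2, 2],
        irange=0.005,
    )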

The reason for the lack of an answer on pylearn2-user may be that pylearn2 is mostly written by researchers at the LISA lab, and thus the threshold for point 13 in the FAQ may be high.