Leaky ReLU backpropagation with NumPy


I wanted to implement the Leaky ReLU activation function with NumPy (forward and backward pass) and would like some comments on whether this implementation is correct.

So Leaky ReLU(x) = x if x > 0, and alpha * x if x <= 0.

This means the derivative is 1 if x > 0 and alpha if x <= 0.
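
For example, with alpha = 0.2, an input of [-2, 0, 3] maps to [-0.4, 0, 3] in the forward pass, and the corresponding elementwise derivative is [0.2, 0.2, 1].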

This is my code:

import numpy as np

alpha = 0.2
mask = None

def forward(x):
    global mask
    mask = x > 0
    ret = x.copy()                       # copy so the caller's array is not modified in place
    ret[~mask] = ret[~mask] * alpha      # scale the non-positive entries by alpha
    return ret

def backward(error):
    ret = np.ones(shape=error.shape)     # derivative is 1 where x > 0 ...
    ret[~mask] = alpha                   # ... and alpha where x <= 0
    ret = np.multiply(ret, error)        # chain rule: scale the upstream gradient elementwise
    return ret

Here, error is the gradient tensor passed down from the layer above.
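
A minimal way to sanity-check the two functions is to compare the backward pass against a central finite-difference approximation of the forward pass. This is only a sketch: the toy input x (chosen to avoid 0, where the function has its kink), the all-ones upstream gradient grad_out, the step eps, and the tolerance are made-up values for illustration.

x = np.array([-2.0, -0.5, 1.5, 3.0])   # made-up test input, avoiding the kink at 0
grad_out = np.ones_like(x)             # with an all-ones upstream gradient,
                                       # backward() returns the local derivative
eps = 1e-6

out = forward(x)                       # sets the global mask as a side effect
analytic = backward(grad_out)          # alpha where x <= 0, 1 where x > 0

# Central finite differences; each forward call overwrites the global mask,
# which is fine here because analytic has already been computed.
numeric = (forward(x + eps) - forward(x - eps)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-4))   # expect True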
