I am trying to add a ReLU activation function layer to my neural network. However, when I run the following code I get this error:

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

I tried using:

class Relu(Activation):
    def __init__(self):
        def relu(x):
            return max(x, 0)

        def reluprime(x):
            return 1 if x > 0 else 0

        super().__init__(relu, reluprime)

I am very new to neural networks. Thank you.

Accepted answer:

Your variable x is a numpy array.

When dealing with numpy arrays, it is recommended to use numpy functions, which act elementwise, rather than built-in Python functions, which don't know what to do with a numpy array.

For instance, max(x, 0) makes sense if x is a number, but here x is an array, so what does it mean? How do you compare an array with 0?

Instead, use np.maximum, which will compare each element of the array with 0 and return an array.

>>> import numpy as np

>>> x = np.array([-12, 6, 0, 13])
>>> x
array([-12,   6,   0,  13])

>>> max(x, 0)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

>>> np.maximum(x, 0)
array([ 0,  6,  0, 13])

Likewise, instead of using a 1 if x > 0 else 0 expression, which makes no sense when x is an array, use the numpy function np.heaviside:

>>> import numpy as np

>>> x = np.array([-12, 6, 0, 13])
>>> x
array([-12,   6,   0,  13])

>>> np.sign(x)
array([-1,  1,  0,  1])

>>> x > 0
array([False,  True, False,  True])

>>> 1 if x > 0 else 0
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

>>> np.heaviside(x, 0)
array([0., 1., 0., 1.])
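If you'd rather not reach for np.heaviside, an equivalent alternative is to cast the boolean mask from the comparison directly; a quick sketch on the same example array:

```python
import numpy as np

x = np.array([-12, 6, 0, 13])

# the elementwise comparison gives a boolean mask; casting it to float
# yields the same 0./1. values as np.heaviside(x, 0)
mask = (x > 0).astype(float)  # array([0., 1., 0., 1.])
```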

Relevant documentation:

numpy.maximum

numpy.heaviside

Another answer:

I am really impressed by Stef's answer. If you want a straightforward implementation, you can write your class in the following way.

import numpy as np

class Relu(Activation):
    def __init__(self):
        def relu(x):
            return np.maximum(x, 0)

        def reluprime(x):
            # create an uninitialized array with the same shape and dtype as x,
            # then fill it using boolean masks on x
            z = np.empty_like(x)
            z[x <= 0] = 0.
            z[x > 0] = 1.
            return z

        super().__init__(relu, reluprime)

Relevant links:

numpy.empty_like

numpy.maximum

numpy boolean indexing
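Putting it together, here is a quick check of the corrected relu/reluprime pair on a sample array. The Activation base class below is a stub standing in for the question's class, which isn't shown, so its layout (storing the function and its derivative) is an assumption:

```python
import numpy as np

class Activation:
    # stub: assumed layout of the question's (unshown) base class,
    # which stores the activation and its derivative
    def __init__(self, activation, activation_prime):
        self.activation = activation
        self.activation_prime = activation_prime

class Relu(Activation):
    def __init__(self):
        def relu(x):
            return np.maximum(x, 0)

        def reluprime(x):
            return np.heaviside(x, 0.0)

        super().__init__(relu, reluprime)

layer = Relu()
x = np.array([-12., 6., 0., 13.])
print(layer.activation(x))        # [ 0.  6.  0. 13.]
print(layer.activation_prime(x))  # [0. 1. 0. 1.]
```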