Could this function be sped up using some Numpy trick?


I have a personal implementation of a feed-forward neural network. If you are familiar with them, then you know about the forward pass. It is inherently iterative, i.e., the result of the previous computation (the previous layer's output) is used in the next one. I have this function for forward feeding, i.e., the forward pass of an example vector.

    def forward_pass(self, example, keep_track=True):
        curr_layer = example.flatten()
        outputs = np.empty(self.L() - 1, dtype=np.ndarray)  # z^(l)
        activations = np.empty(self.L(), dtype=np.ndarray)  # a^(l)
        activations[0] = curr_layer
        for W, b, l in zip(self.W(), self.b(), range(self.L() - 1)):
            outputs[l] = (W @ curr_layer) + b
            if self.is_reg and (l == self.L() - 2):  # last layer regression
                activations[l + 1] = outputs[l]
                continue  # we should be done
            activations[l + 1] = self.act(outputs[l])
            curr_layer = activations[l + 1]
        return (outputs, activations) if keep_track else activations[-1]

As we can see, there is a loop involved. I can't imagine there being a way to get rid of it and do this in a more vectorised way, but I have been surprised in the past by what can be accomplished with NumPy.
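One possible angle (a sketch, not a claim about the asker's class): the layer loop itself cannot be vectorised away, because each layer's input depends on the previous layer's output, but a loop over *examples* can be. Feeding a whole batch as a `(n_features, n_examples)` matrix turns many per-example calls into a few matrix products. The names `batched_forward_pass`, `Ws`, `bs`, and `act` below are hypothetical stand-ins for the class's `W()`, `b()`, and `act` members:

```python
import numpy as np

def batched_forward_pass(Ws, bs, X, act):
    """Forward pass over a whole batch at once.

    X has shape (n_features, n_examples); Ws and bs are the
    per-layer weight matrices and bias vectors.
    """
    A = X
    for W, b in zip(Ws, bs):
        # b[:, None] broadcasts the bias vector across all examples
        Z = W @ A + b[:, None]
        A = act(Z)
    return A

# toy usage with random parameters (not the asker's network)
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
bs = [rng.standard_normal(4), rng.standard_normal(2)]
X = rng.standard_normal((3, 5))  # 5 examples at once
out = batched_forward_pass(Ws, bs, X, np.tanh)
print(out.shape)  # (2, 5)
```

The layer loop is still there, but it runs once per batch instead of once per example, and each `W @ A` is a single BLAS-backed matrix multiply.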

Do you see a way to speed up this function?
