One back-propagation pass in keras


I would like to train a neural network with a policy gradient method. The training involves computing the gradient of a user-defined loss with a single back-propagation pass. I know that gradients are computed automatically once the model is compiled and trained as follows:

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

However, training with `model.fit` after compiling performs many forward and backward passes through the network. What I am looking for is a single back-propagation pass. My question is whether this is possible in Keras, or whether I need to do it in PyTorch or TensorFlow directly.
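In TensorFlow 2, a single forward and backward pass can be done by bypassing `model.fit` and using `tf.GradientTape` directly. Below is a minimal sketch: the model architecture, the dummy data, and the negative-log-probability policy-gradient loss are all illustrative assumptions, not taken from the question.

```python
import numpy as np
import tensorflow as tf

# Illustrative model: 4-dimensional state in, 2 action probabilities out.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation='relu'),
    tf.keras.layers.Dense(2, activation='softmax'),
])
optimizer = tf.keras.optimizers.Adam()

# Dummy batch of states, sampled actions, and returns.
x = np.random.rand(16, 4).astype('float32')
actions = np.random.randint(0, 2, size=16)
returns = np.random.rand(16).astype('float32')

with tf.GradientTape() as tape:
    probs = model(x, training=True)                        # one forward pass
    action_probs = tf.gather(probs, actions, batch_dims=1)
    # User-defined policy-gradient loss: -log pi(a|s) * return
    loss = -tf.reduce_mean(tf.math.log(action_probs + 1e-8) * returns)

# One backward pass: gradient of the loss w.r.t. all trainable weights.
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

Each time this block runs, exactly one forward and one backward pass occur; you control the update loop yourself instead of delegating it to `model.fit`.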
