I want to constrain the parameters of an intermediate layer in a neural network to prefer discrete values: -1, 0, or 1. The idea is to add a custom objective function that increases the loss if the parameters take any other value. Note that I want to constrain the parameters of a particular layer, not all layers.
How can I implement this in PyTorch? I want to add this custom loss to the total loss in the training loop, something like this:

```python
custom_loss = constrain_parameters_to_be_discrete
loss = other_loss + custom_loss
```
Maybe using a Dirichlet prior could help; any pointers on this?
You can use the loss function:

$$L(x) = x^2 (x - 1)^2 (x + 1)^2$$
*[Plot: the proposed loss for a single element]*
As you can see, the proposed loss is zero for `x ∈ {-1, 0, 1}` and positive otherwise. Note that if you want to apply this loss to the weights of a specific layer, then your `x` here are the weights, not the activations of the layer.
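Here is a minimal sketch of how this could look in a training loop, assuming the polynomial loss above. The model, data, layer choice (`model[2]`), and penalty weight are all made up for illustration:

```python
import torch
import torch.nn as nn

def discreteness_loss(params: torch.Tensor) -> torch.Tensor:
    # Zero iff every element of `params` is exactly -1, 0, or 1;
    # smooth and positive everywhere else.
    return (params**2 * (params - 1)**2 * (params + 1)**2).sum()

# Toy model for illustration; only the second Linear layer is constrained.
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 5),  # model[2]: the layer whose weights we constrain
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

x = torch.randn(8, 10)      # dummy inputs
target = torch.randn(8, 5)  # dummy targets

for step in range(100):
    optimizer.zero_grad()
    other_loss = criterion(model(x), target)
    # Penalty applied to the weights of one specific layer only,
    # not to the activations and not to the other layers.
    custom_loss = discreteness_loss(model[2].weight)
    loss = other_loss + 0.1 * custom_loss  # 0.1 is an assumed penalty weight
    loss.backward()
    optimizer.step()
```

Keep in mind this penalty only *encourages* discreteness; if you need the weights to be exactly -1, 0, or 1, you would typically still round or quantize them after training.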