I want to train a deep neural network model with a binary target using nsl.keras.GraphRegularization,
as described in this tutorial. My model computes a triplet semihard loss on an intermediate dense layer, which should not be graph-regularized.
From the nsl.keras.GraphRegularization definition on GitHub:
"Incorporates graph regularization into the loss of base_model. Graph regularization is done on the logits layer and only during training."
Does this mean that the intermediate triplet semihard loss will not be affected by this regularization?
Yes, that's right. Graph regularization is applied only to the outputs of base_model. If your base_model uses a triplet semihard loss in another layer, that loss should remain unaffected and preserved. If that's not the case, please file a bug at https://github.com/tensorflow/neural-structured-learning/issues.