Keras custom loss - Combining losses of multiple branches

I have a two-branch network where one branch outputs a regression value and the other outputs a classification label.

model = Model(inputs=inputs, outputs=[output1, output2])    
model.compile(loss=[my_loss_reg, my_loss_class], optimizer='adam')

I want to implement a custom loss function my_loss_reg() for the regression branch, in which I add a fraction of the classification loss, as follows:

def my_loss_reg(y_true, y_pred):
    loss_mse = K.mean(K.sum(K.square(y_true - y_pred)))
    #loss_class = calculate_classification_loss() # How to implement this?
    final_loss = some_function(loss_mse, loss_class) # Can calculate only if loss_class is available
    return final_loss

Here y_true and y_pred are the true and predicted regression values at the regression branch. To calculate the classification loss I need the true and predicted classification labels, which are not available inside my_loss_reg().

My question is: how can I calculate or access the classification loss at the regression end of the network? Similarly, I want to access the regression loss at the classification end while computing the custom loss function my_loss_class() for the classification branch.

How can I do that? Any code snippets would be helpful. I found one proposed solution, but it no longer works with the latest versions of TensorFlow and Keras.

1 Answer


Everything you need is available natively in Keras.

You can combine multiple losses automatically using the loss_weights parameter of compile().

In the example below I reproduce a similar task: an mse loss for the regression branch combined with sparse_categorical_crossentropy for the classification branch.

import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

features, n_sample, n_class = 10, 200, 3

X = np.random.uniform(0, 1, (n_sample, features))  # random inputs
y = np.random.randint(0, n_class, n_sample)        # random integer class labels

inp = Input(shape=(features,))
x = Dense(64, activation='relu')(inp)
hidden = Dense(16, activation='relu')(x)  # shared representation
x = Dense(64, activation='relu')(hidden)
out_reg = Dense(features, name='out_reg')(x)  # regression output
x = Dense(32, activation='relu')(hidden)
out_class = Dense(n_class, activation='softmax', name='out_class')(x)  # classification output

model = Model(inp, [out_reg, out_class])
model.compile(optimizer='adam',
              loss={'out_reg': 'mse', 'out_class': 'sparse_categorical_crossentropy'},
              loss_weights={'out_reg': 1., 'out_class': 0.5})

model.fit(X, [X, y], epochs=10)
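Because the outputs are named, the targets can equivalently be passed as a dict keyed by output name, which makes the output/target pairing explicit:

model.fit(X, {'out_reg': X, 'out_class': y}, epochs=10)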

In this specific case, the total loss minimized during training is 1*mse_loss + 0.5*crossentropy_loss, i.e. the weighted sum of the two per-output losses.
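To make the weighting concrete, here is a rough sketch of the sum Keras computes behind the scenes (illustration only, reusing the model and data defined above):

import tensorflow as tf

mse = tf.keras.losses.MeanSquaredError()
scce = tf.keras.losses.SparseCategoricalCrossentropy()

y_reg_pred, y_class_pred = model(X)  # forward pass through both branches
total = 1.0 * mse(X, y_reg_pred) + 0.5 * scce(y, y_class_pred)  # weighted sum of the two losses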

If you want to use your own custom losses, you simply pass them in the same way:

def my_loss_reg(y_true, y_pred):
    return ...

def my_loss_class(y_true, y_pred):
    return ...

model.compile(optimizer='adam', 
              loss = {'out_reg':my_loss_reg, 'out_class':my_loss_class},
              loss_weights = {'out_reg':1., 'out_class':0.5})

model.fit(X, [X,y], epochs=10)
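For reference, a minimal sketch of what those two custom losses could look like, simply reimplementing the built-ins with backend ops (extend the bodies with your own extra terms as needed):

from tensorflow.keras import backend as K

def my_loss_reg(y_true, y_pred):
    # per-sample mean squared error, equivalent to the built-in 'mse'
    return K.mean(K.square(y_true - y_pred), axis=-1)

def my_loss_class(y_true, y_pred):
    # equivalent to the built-in 'sparse_categorical_crossentropy'
    return K.sparse_categorical_crossentropy(y_true, y_pred)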