I am working with a DNN in tf.Keras, which looks as follows:
from tensorflow.keras.layers import Input, Dense, LeakyReLU, Concatenate
from tensorflow.keras.models import Model

# Construct the DNN model with 2 inputs, 2 outputs and 3 hidden layers
c0_input = Input(shape=(1,), name="c0")
c1_input = Input(shape=(1,), name="c1")
# Concatenate the two inputs into one tensor
tensor_input = Concatenate(axis=-1)([c0_input, c1_input])
hidden_1 = Dense(100)(tensor_input)
activation_1 = LeakyReLU(alpha=0.1)(hidden_1)
hidden_2 = Dense(100)(activation_1)
activation_2 = LeakyReLU(alpha=0.1)(hidden_2)
hidden_3 = Dense(100)(activation_2)
activation_3 = LeakyReLU(alpha=0.1)(hidden_3)
# The 2 outputs are named x0 and x1
x0_output = Dense(1, name="x0")(activation_3)
x1_output = Dense(1, name="x1")(activation_3)
# The model
DNN_model = Model(inputs=[c0_input, c1_input], outputs=[x0_output, x1_output])
As you can see, this DNN has 2 inputs (c0, c1) and 2 outputs (x0, x1). The loss function I am aiming at is c0 * (x0 - x1**2)**2 + c1 * (x1 - c0 * x0)**2, which includes both inputs and both outputs. Here are my questions:
- How can I write a loss function which takes all of c0, c1, x0 and x1 into account? I have tried to work with a custom loss function in Keras, but it does not seem correct to slice and extract x0 and x1 from y_pred (which should be the only prediction tensor the loss function receives).
- How can I fit the training data? In this case we have one array of training data for c0 and another for c1.
- If this is hard to achieve with Keras, is there a recommendation for another package that is easier to deal with?
Many thanks for reading and answering my question. I have tried to play with custom losses and loss weights, but it doesn't seem to help so far.
You can use the tf.keras.layers.Concatenate class; setting axis=-1 and trainable=False will fix your problem. I too have had the experience of writing a custom loss function for multiple inputs and multiple outputs. Here are the changes I made to your existing code.
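Sketched out, the rewiring could look like this: both inputs and both outputs are concatenated into a single 4-column output, so the loss function can see all of c0, c1, x0 and x1 through y_pred. The merged column order [c0, c1, x0, x1] is my assumption; everything else mirrors the model from the question.

```python
from tensorflow.keras.layers import Input, Dense, LeakyReLU, Concatenate
from tensorflow.keras.models import Model

c0_input = Input(shape=(1,), name="c0")
c1_input = Input(shape=(1,), name="c1")
tensor_input = Concatenate(axis=-1)([c0_input, c1_input])
hidden_1 = Dense(100)(tensor_input)
activation_1 = LeakyReLU(alpha=0.1)(hidden_1)
hidden_2 = Dense(100)(activation_1)
activation_2 = LeakyReLU(alpha=0.1)(hidden_2)
hidden_3 = Dense(100)(activation_2)
activation_3 = LeakyReLU(alpha=0.1)(hidden_3)
x0_output = Dense(1, name="x0")(activation_3)
x1_output = Dense(1, name="x1")(activation_3)
# Merge inputs and outputs into one tensor with columns [c0, c1, x0, x1]
merged = Concatenate(axis=-1, name="merged", trainable=False)(
    [c0_input, c1_input, x0_output, x1_output]
)
DNN_model = Model(inputs=[c0_input, c1_input], outputs=merged)
```

The first two columns of the output simply echo the inputs; the network still only learns the weights behind x0 and x1.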
Custom loss function
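A sketch of such a loss, assuming the merged output carries the columns [c0, c1, x0, x1] in that order:

```python
import tensorflow as tf

def custom_loss(y_true, y_pred):
    # y_pred is the merged model output with columns [c0, c1, x0, x1];
    # y_true is a dummy target and is ignored.
    c0 = y_pred[:, 0]
    c1 = y_pred[:, 1]
    x0 = y_pred[:, 2]
    x1 = y_pred[:, 3]
    # The loss from the question: c0*(x0 - x1**2)**2 + c1*(x1 - c0*x0)**2
    return c0 * (x0 - x1 ** 2) ** 2 + c1 * (x1 - c0 * x0) ** 2
```

Because the loss never reads y_true, any array of the right shape can be passed as the target during training.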
Generate dummy data to test
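For instance (the sample count and the uniform ranges below are arbitrary choices for a smoke test):

```python
import numpy as np

N = 1000  # arbitrary number of training samples
c0_train = np.random.uniform(0.5, 1.5, size=(N, 1)).astype("float32")
c1_train = np.random.uniform(0.5, 1.5, size=(N, 1)).astype("float32")
# The custom loss ignores y_true, so zeros of the merged shape (N, 4) suffice
dummy_y = np.zeros((N, 4), dtype="float32")
```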
Fitting model:
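Putting the pieces together, fitting could look like this (a self-contained sketch; the layer sizes follow the question, while the epoch count and batch size are arbitrary):

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense, LeakyReLU, Concatenate
from tensorflow.keras.models import Model

# Rebuild the merged-output model; output columns are [c0, c1, x0, x1]
c0_input = Input(shape=(1,), name="c0")
c1_input = Input(shape=(1,), name="c1")
h = Concatenate(axis=-1)([c0_input, c1_input])
for _ in range(3):
    h = LeakyReLU(alpha=0.1)(Dense(100)(h))
x0_output = Dense(1, name="x0")(h)
x1_output = Dense(1, name="x1")(h)
merged = Concatenate(axis=-1)([c0_input, c1_input, x0_output, x1_output])
DNN_model = Model(inputs=[c0_input, c1_input], outputs=merged)

def custom_loss(y_true, y_pred):
    # Slice the merged output; y_true is a dummy target and is ignored
    c0, c1, x0, x1 = y_pred[:, 0], y_pred[:, 1], y_pred[:, 2], y_pred[:, 3]
    return c0 * (x0 - x1 ** 2) ** 2 + c1 * (x1 - c0 * x0) ** 2

DNN_model.compile(optimizer="adam", loss=custom_loss)

N = 256
c0_train = np.random.uniform(0.5, 1.5, (N, 1)).astype("float32")
c1_train = np.random.uniform(0.5, 1.5, (N, 1)).astype("float32")
dummy_y = np.zeros((N, 4), dtype="float32")
history = DNN_model.fit([c0_train, c1_train], dummy_y,
                        epochs=2, batch_size=32, verbose=0)
```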
Predict code:
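Prediction then returns the merged tensor, and x0 and x1 are recovered by slicing the last two columns (again assuming the [c0, c1, x0, x1] column order):

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense, LeakyReLU, Concatenate
from tensorflow.keras.models import Model

# Same merged-output model as above (untrained here, just to show the slicing)
c0_input = Input(shape=(1,), name="c0")
c1_input = Input(shape=(1,), name="c1")
h = Concatenate(axis=-1)([c0_input, c1_input])
for _ in range(3):
    h = LeakyReLU(alpha=0.1)(Dense(100)(h))
x0_output = Dense(1, name="x0")(h)
x1_output = Dense(1, name="x1")(h)
merged = Concatenate(axis=-1)([c0_input, c1_input, x0_output, x1_output])
DNN_model = Model(inputs=[c0_input, c1_input], outputs=merged)

c0_test = np.random.uniform(0.5, 1.5, (10, 1)).astype("float32")
c1_test = np.random.uniform(0.5, 1.5, (10, 1)).astype("float32")
merged_pred = DNN_model.predict([c0_test, c1_test], verbose=0)
# Columns 0-1 echo the inputs; columns 2-3 are the network outputs
x0_pred = merged_pred[:, 2:3]
x1_pred = merged_pred[:, 3:4]
```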
I have tested this code end to end.