I have a neural network like this:
const model = tf.sequential();
model.add(tf.layers.dense({activation: "sigmoid", units: 50, inputShape: [150]}));
model.add(tf.layers.dense({activation: "sigmoid", units: 50}));
model.add(tf.layers.dense({activation: "softmax", units: 5}));
The output layer produces 5 probabilities that sum to 1.
But now I want to add two more neurons to the output layer whose outputs are not probabilities and shouldn't be normalized the way softmax normalizes the others. So they need a different activation function while still being part of the same output layer. How can I achieve that?
I guess I have to create a custom layer and use it in place of the current softmax layer, but I don't understand how to do that correctly.