Why does IoU as a metric for semantic segmentation raise a value error in Keras?


Applying a binary semantic segmentation model ("unet", tensorflow.keras) with classes 0 and 1 works well when the model is compiled with the metric "accuracy" or "BinaryCrossentropy", but not with BinaryIoU; see the two compile calls below.
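A sketch of the working call (the optimizer and loss are the ones I actually use; only the metrics list differs in the failing call):

import tensorflow as tf

unet.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=["accuracy"],  # passing tf.keras.metrics.BinaryCrossentropy(from_logits=True) also works
)

The failing call, with BinaryIoU as the metric: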

unet.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=[tf.keras.metrics.BinaryIoU(target_class_ids=[1], name="IoU")],
)

With BinaryIoU as the metric, model.fit raises a shape mismatch error. The traceback concludes as follows:

Node: 'confusion_matrix/stack_1' Shapes of all inputs must match: values[0].shape = [4194304] != values[1].shape = [8388608] [[{{node confusion_matrix/stack_1}}]] [Op:__inference_train_function_24084]

In other words, there is a factor of 2 between the shapes of values[0] and values[1]. I suspect that values[0] pertains to the true masks and values[1] to the predicted masks, but I do not understand Keras's model.fit well enough to see where this mismatch originates.
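For reference, a minimal sketch of how I would exercise the metric outside model.fit, with random tensors shaped like my data (the batch size of 16 and the random values are assumptions made only for this sketch):

import numpy as np
import tensorflow as tf

BATCH = 16  # assumed batch size, only for this sketch

# Ground-truth masks shaped like my data: (batch, 512, 512, 1), labels 0/1, dtype uint16
y_true = np.random.randint(0, 2, size=(BATCH, 512, 512, 1)).astype("uint16")

# Predictions with the same shape, given to the metric as probabilities in [0, 1]
y_pred = np.random.uniform(0.0, 1.0, size=(BATCH, 512, 512, 1)).astype("float32")

iou = tf.keras.metrics.BinaryIoU(target_class_ids=[1], name="IoU")
iou.update_state(y_true, y_pred)
print(iou.result().numpy())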

The shape of my images and masks is (512, 512, 1), and the masks' dtype is uint16. Thanks for educating me.
