xcrun coreml model error for YOLACT ONNX with no priors layer and softmax layer


I have converted the YOLACT PyTorch model to ONNX without the softmax and priors layers, and then tried to convert the ONNX model to Core ML. The terminal shows the conversion finished without errors and reports that model compilation is done, as below.

210/211: Converting Node Type Concat
211/211: Converting Node Type Concat
Translation to CoreML spec completed. Now compiling the CoreML model.
Model Compilation done.
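For reference, the ONNX -> Core ML step was done roughly like the following minimal sketch (assuming the onnx-coreml converter; the file name and deployment target here are placeholders, not necessarily the exact arguments used):

    import onnx
    from onnx_coreml import convert

    # Load the simplified ONNX model (file name is a placeholder).
    onnx_model = onnx.load("yolact_test_nosoftmax_simplify.onnx")

    # Convert to a Core ML model; the conversion log above
    # ("Translation to CoreML spec completed ...") is printed by this step.
    mlmodel = convert(model=onnx_model, minimum_ios_deployment_target="13")

    mlmodel.save("yolact_test_nosoftmax_simplify.mlmodel")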

But when I compile the Core ML model on macOS, the following error shows up:

xcrun coremlc compile yolact_test_nosoftmax_simplify.mlmodel

coremlc: Error: compiler error: Espresso exception: Invalid blob shape generic_elementwise_kernel: cannot broadcast [18, 18, 128, 1, 399] and [35, 35, 128, 1, 399]

I have no idea how to debug this right now. Any suggestions would be appreciated.

1 Answer

There is an elementwise operation in your model that tries to combine a tensor of shape (18, 18, 128, 1, 399) with a tensor of shape (35, 35, 128, 1, 399). These two shapes are not broadcast-compatible, hence the error message.
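You can reproduce the same kind of failure outside Core ML: numpy follows the same broadcasting rule, where each pair of dimensions must either match or one of them must be 1. This is only an illustration of the rule, not your model's actual operation:

    import numpy as np

    a = np.zeros((18, 18, 128, 1, 399))
    b = np.zeros((35, 35, 128, 1, 399))

    # The leading dimensions (18 vs 35) neither match nor are 1,
    # so broadcasting is impossible and the addition raises a ValueError.
    try:
        _ = a + b
    except ValueError as e:
        print(e)  # operands could not be broadcast together ...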

To solve this, you need to find out at which point in your model this happens and then fix the issue there. It might be something that went wrong in the PyTorch -> ONNX conversion, or something that goes wrong in the ONNX -> Core ML conversion.
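One way to narrow it down is to run ONNX shape inference and look for the node whose inputs have those two shapes. A rough sketch (the file name is a placeholder, and not every tensor will have a statically inferred shape):

    import onnx
    from onnx import shape_inference

    model = onnx.load("yolact_test_nosoftmax_simplify.onnx")
    inferred = shape_inference.infer_shapes(model)

    # Map each tensor name to its inferred shape (where known).
    shapes = {}
    for vi in list(inferred.graph.value_info) + list(inferred.graph.input) + list(inferred.graph.output):
        dims = [d.dim_value for d in vi.type.tensor_type.shape.dim]
        shapes[vi.name] = dims

    # Print elementwise nodes together with their input shapes,
    # so mismatched pairs like (18, 18, ...) vs (35, 35, ...) stand out.
    for node in inferred.graph.node:
        if node.op_type in ("Add", "Mul", "Sub", "Div"):
            in_shapes = [shapes.get(name) for name in node.input]
            print(node.name or node.op_type, in_shapes)

Opening the .onnx file in Netron can also help, since it visualizes the graph and the tensor shapes it knows about, which often makes the offending node easier to spot.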