NVidia DeepStream Output Inference Class Mismatch - "Vehicle Class"


• Hardware Platform (Jetson / GPU) Jetson Nano 4GB, Ubuntu 18.04

• DeepStream Version marketplace.azurecr.io/nvidia/deepstream-iot2-l4t:latest

• JetPack Version 4.3

• Issue Type Output inference class is different from Model class

• How to reproduce the issue? On DeepStream, deploy an object detection ONNX model. My model is an ONNX model exported from Azure Custom Vision. My label file has 2 classes: 'Mask' and 'No_Mask'. Deployment works fine and I am able to execute the model with DeepStream. However, the output inference classes I am getting are 'Vehicle' and 'No_Mask'. Can you please help me understand why I am getting the output inference label "Vehicle" when it is not in my model?

Sample output inference log:

{"log":" "1|324|23|380|61|Vehicle|#|||||||0"\n","stream":"stdout","time":"2021-01-05T16:15:15.614591738Z"}

{"log":" "1|324|23|380|61|Vehicle|#|||||||0"\n","stream":"stdout","time":"2021-01-05T16:15:15.614790179Z"}

{"log":" "2|141|15|365|161|No Mask"\n","stream":"stdout","time":"2021-01-05T16:15:15.614221209Z"}
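For reference, the detection strings in the log above appear to be pipe-separated as id|left|top|right|bottom|label. A hypothetical parser sketch (the field order is inferred from the sample output, not from DeepStream documentation):

```python
# Hypothetical parser for the pipe-separated detection strings in the log.
# Assumed field order: id|left|top|right|bottom|label|... (trailing fields ignored)
def parse_detection(line: str) -> dict:
    fields = line.strip().strip('"').split("|")
    return {
        "id": int(fields[0]),
        # bounding box as (left, top, right, bottom)
        "bbox": tuple(int(v) for v in fields[1:5]),
        "label": fields[5],
    }

det = parse_detection("1|324|23|380|61|Vehicle|#|||||||0")
print(det["label"])  # prints "Vehicle" -- expected "Mask" or "No_Mask"
```

Parsing the log this way makes the mismatch concrete: the class id from the model is being mapped to a label name that does not belong to this model's label file.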

There is 1 answer below.

You've most probably specified the wrong labels file, or the classes in it are wrong. The labels file is provided via labelfile-path in the nvinfer config, e.g.

labelfile-path=labels.txt
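A minimal sketch of the relevant nvinfer config entries for a two-class model (file names here are assumptions; adjust them to your deployment). If the config still points at a sample labels file, such as the vehicle-oriented labels shipped with the DeepStream detector samples, class id 0 will be reported under that sample's name (e.g. "Vehicle") instead of "Mask":

```ini
; labels.txt must list one class name per line, in the model's output order:
;   Mask
;   No_Mask

[property]
onnx-file=model.onnx          ; assumed file name for the Custom Vision export
labelfile-path=labels.txt     ; must be YOUR labels file, not a sample's
num-detected-classes=2        ; must match the number of lines in labels.txt
```

Check that labelfile-path resolves to the file containing Mask/No_Mask inside the container, since the label names are taken from this file by class index, not from the ONNX model itself.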