OpenVINOInferencer on GPU


I've trained a model (PaDiM with a resnet50_2 backbone) using the anomalib package (Python). The recall and precision are quite good, so I want to use it in a demo.

Currently I'm using this code:

from anomalib.deploy import OpenVINOInferencer

inferencer = OpenVINOInferencer(
    path=openvino_model_path,   # exported model.xml / model.bin
    metadata=metadata_path,     # metadata.json produced during export
    device="CPU",
)

predictions = inferencer.predict(image=image)
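For context, throughput figures like the 40 fps mentioned below can be checked with a simple timing loop. This is a minimal sketch: `predict` here is a placeholder standing in for the real `inferencer.predict(image=image)` call, so the snippet is self-contained.

```python
import time

def predict(image):
    """Placeholder for the real inferencer.predict(image=image) call."""
    return image

def measure_fps(n_frames=1000):
    """Rough throughput estimate: frames processed per wall-clock second."""
    image = object()  # placeholder input; use a real image array in practice
    start = time.perf_counter()
    for _ in range(n_frames):
        predict(image)
    elapsed = time.perf_counter() - start
    return n_frames / elapsed

fps = measure_fps()
```

With the real inferencer swapped in, averaging over a few hundred frames smooths out warm-up and per-call jitter.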

This works perfectly but only reaches around 40 fps. When I try it with device="GPU", the build fails (I have an Nvidia GPU).

My application requires 1000-2000 fps, which I've been able to reach with CNNs.

How can I do inference on the GPU?

My training process outputs multiple model file types:

- model.bin
- model.onnx
- model.xml
- metadata.json
1 Answer

Aznie_Intel (Best Answer)

The OpenVINO toolkit officially supports Intel hardware only; it does not support other vendors' hardware, including Nvidia GPUs.

To run with device="GPU", use any supported Intel GPU; refer to the OpenVINO System Requirements for the supported hardware list.
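On a machine with a supported Intel GPU, you can first check which devices the OpenVINO runtime actually sees before passing device="GPU" to OpenVINOInferencer. This is a minimal sketch assuming the openvino Python package is installed; the ImportError fallback is only there so the snippet runs anywhere.

```python
# Sketch: ask the OpenVINO runtime which devices it can use.
try:
    from openvino.runtime import Core  # OpenVINO Python API
    devices = Core().available_devices  # e.g. ['CPU', 'GPU'] on Intel iGPU machines
except ImportError:
    devices = ["CPU"]  # fallback so this sketch still runs without OpenVINO installed

# Intel GPUs show up as 'GPU', 'GPU.0', 'GPU.1', ...
device = "GPU" if any(d.startswith("GPU") for d in devices) else "CPU"
print(device)
```

If a GPU device appears in the list, passing device="GPU" to OpenVINOInferencer should target the Intel GPU plugin; an Nvidia card will never appear here.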