DeepStream inference on Windows


If I train a model with the Nvidia DeepStream SDK on Linux, can I use it for inference on Windows?

I know that the SDK is not available for Windows, but is it necessary for inference?

I would prefer a solution without Docker, but I'm also interested in a Dockerized version.

First of all, as far as I know, DeepStream only provides the tools to perform optimized inference on Nvidia hardware (GPUs, Jetson).

This means it is not a training tool. For training you should use TLT (Transfer Learning Toolkit), which I understand is now called TAO (Train, Adapt and Optimize).

To build DeepStream-based applications, you need the SDK. For deployment, however, the most recommended route is the Docker images Nvidia offers at https://ngc.nvidia.com/catalog/containers. In that case the SDK is not necessary, since the image contains everything needed to run DeepStream applications.
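As a rough sketch, pulling and running one of those NGC containers might look like the following. The image tag and the mounted model path are illustrative assumptions, so check the NGC catalog for the current image name and tag; GPU passthrough also requires the NVIDIA Container Toolkit (and on Windows, Docker's GPU support goes through the WSL2 backend):

```shell
# Pull a DeepStream container from NGC (tag is an assumption;
# browse https://ngc.nvidia.com/catalog/containers for current versions)
docker pull nvcr.io/nvidia/deepstream:6.0-triton

# Run it interactively with GPU access, mounting a host directory
# of models into the container (/path/to/models is a placeholder)
docker run --gpus all -it --rm \
    -v /path/to/models:/models \
    nvcr.io/nvidia/deepstream:6.0-triton
```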