pose estimation on Triton inference server


I am struggling to run pose-estimation models on the NVIDIA Triton Inference Server. The models (OpenPose, AlphaPose, HRNet, etc.) load normally, but the post-processing is the problem.


There is 1 solution below.


You can refer to the post-processing script in the Triton client docs. They give an example for an image classifier, image_client.py:

import numpy as np


def postprocess(results, output_name, batch_size, batching):
    """
    Post-process results to show classifications.
    """

    output_array = results.as_numpy(output_name)
    if len(output_array) != batch_size:
        raise Exception("expected {} results, got {}".format(
            batch_size, len(output_array)))

    # Include special handling for non-batching models
    for batch_results in output_array:
        if not batching:
            batch_results = [batch_results]
        for result in batch_results:
            # String outputs arrive as byte arrays and must be decoded
            if output_array.dtype.type == np.object_:
                cls = "".join(chr(x) for x in result).split(':')
            else:
                cls = result.split(':')
            print("    {} ({}) = {}".format(cls[0], cls[1], cls[2]))
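That script only decodes classification labels, though. A pose model such as HRNet typically returns keypoint heatmaps (one per joint), so the post-processing has to turn each heatmap's peak into an (x, y) coordinate plus a confidence score. A minimal sketch, assuming the server response has already been converted (via results.as_numpy) into a NumPy array of shape (batch, num_keypoints, H, W) — the output name and shape depend on your specific model:

```python
import numpy as np

def decode_heatmaps(heatmaps):
    """Convert keypoint heatmaps of shape (B, K, H, W) into
    per-image (x, y, confidence) triples via each map's argmax."""
    batch, num_kpts, h, w = heatmaps.shape
    flat = heatmaps.reshape(batch, num_kpts, -1)
    idx = flat.argmax(axis=2)                # peak index per heatmap
    conf = flat.max(axis=2)                  # peak value as confidence
    xs = (idx % w).astype(np.float32)        # column of the peak
    ys = (idx // w).astype(np.float32)       # row of the peak
    return np.stack([xs, ys, conf], axis=2)  # shape (B, K, 3)

# Example: fake heatmaps for 1 image with 2 keypoints on a 4x4 grid
hm = np.zeros((1, 2, 4, 4), dtype=np.float32)
hm[0, 0, 1, 2] = 0.9   # keypoint 0 peaks at (x=2, y=1)
hm[0, 1, 3, 0] = 0.7   # keypoint 1 peaks at (x=0, y=3)
print(decode_heatmaps(hm))
```

In practice you would also scale the (x, y) coordinates from the heatmap resolution back up to the original image size, since pose networks usually predict heatmaps at a fraction (often 1/4) of the input resolution.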