"ValueError: numpy.ndarray size changed " while trying Intel lpot in tensorflow model


While trying out the Intel Low Precision Optimization Tool (LPOT) on a TensorFlow model, I am getting a ValueError.

Please find the command I tried below:

# Command for tuning ssd_resnet50_v1
bash run_tuning.sh --config=ssd_resnet50_v1.yaml --input_model=/tmp/ssd_resnet50_v1_fpn_shared_box_predictor_640x640_coco14_sync_2018_07_03/frozen_inference_graph.pb --output_model=./tensorflow-ssd_resnet50_v1-tune.pb

Running this command produces the error below:

    import pycocotools._mask as _mask
  File "pycocotools/_mask.pyx", line 1, in init pycocotools._mask
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject

Here is the GitHub link I followed: https://github.com/intel/neural-compressor/tree/master/examples/tensorflow/object_detection
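
Since the failure happens at the `import pycocotools._mask` line, the error should be reproducible without the tuning script at all (assuming the same Python environment that run_tuning.sh uses):

# Quick check, assuming the same Python environment as run_tuning.sh:
# print the installed numpy version and trigger the failing import directly
python -c "import numpy; print(numpy.__version__)"
python -c "import pycocotools._mask"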


There is 1 answer below.

Athira - Intel (Best Answer)

Please try upgrading your numpy version. The command is given below:

pip install --upgrade numpy

I ran into the same issue, and upgrading numpy resolved it for me.

Thanks
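
If upgrading numpy alone does not clear the error, rebuilding pycocotools from source is also worth trying, so that its C extension is compiled against the numpy headers now installed (this extra step is a suggestion, not part of the accepted answer):

# Suggested extra step (not from the accepted answer): force pycocotools
# to be rebuilt from source against the upgraded numpy
pip install --force-reinstall --no-binary pycocotools pycocotools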