Assertion Error: Framework is not detected correctly from model format


I am trying the Intel Low Precision Optimization Tool (LPOT) and following this GitHub example: https://github.com/intel/lpot/tree/master/examples/tensorflow/object_detection. When I run the quantization command as shown below

bash run_tuning.sh --config=ssd_mobilenet_v1.yaml --input_model=ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb --output_model=./tensorflow-ssd_mobilenet_v1-tune.pb

I am getting the error below.

Traceback (most recent call last):
  File "main.py", line 60, in <module>
    evaluate_opt_graph.run()
  File "main.py", line 48, in run
    quantizer.model = self.args.input_graph
  File "/home/uxxxxx/.conda/envs/lpot/lib/python3.7/site-packages/lpot/experimental/component.py", line 334, in model
    self._model = Model(user_model)
  File "/home/uxxxxx/.conda/envs/lpot/lib/python3.7/site-packages/lpot/experimental/common/model.py", line 43, in __new__
    assert False, 'Framework is not detected correctly from model format.'
AssertionError: Framework is not detected correctly from model format.

Please help!

1 Answer

Abhijeet - Intel:

This is because your input model path needs a small correction: prefix the relative path with ./, i.e. pass ./ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb instead of ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb to --input_model (the full corrected command is shown below). It worked for me. Regards
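For reference, this is the command from the question with only the ./ prefix added to --input_model; all other paths are unchanged:

bash run_tuning.sh --config=ssd_mobilenet_v1.yaml --input_model=./ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb --output_model=./tensorflow-ssd_mobilenet_v1-tune.pb

If the assertion still appears, it is worth confirming that the frozen graph really exists at that relative path from the directory where run_tuning.sh is invoked, for example with a plain shell check such as ls -l ./ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb. As the traceback shows, the error is raised by LPOT's Model wrapper when it cannot map the given input model to a supported framework.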