I am using the Coral Dev Board and the Jetson TX2 developer board. To deploy a model to either of them, the model must be a frozen graph with the .pb extension.
Is there a link where the models already come with a .pb extension? Currently I am using this link: TF_slim
All of the models there only have a .ckpt file, nothing else. There is no .meta or anything, and I do not know how to convert them to .pb.
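(As a sanity check, a slim-style `.ckpt` with no `.meta` file still contains all of the trained weights; `tf.train.list_variables` can read the names and shapes straight from the checkpoint. A minimal sketch below uses a toy checkpoint with a made-up variable name standing in for the real `inception_v4.ckpt`:)

```python
import os
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Save a toy checkpoint standing in for inception_v4.ckpt
# ("InceptionV4/demo_weights" is a made-up name for illustration).
os.makedirs("/tmp/demo_ckpt", exist_ok=True)
g = tf.Graph()
with g.as_default():
    v = tf.compat.v1.get_variable("InceptionV4/demo_weights", shape=[3, 3])
    with tf.compat.v1.Session() as sess:
        sess.run(tf.compat.v1.global_variables_initializer())
        ckpt_path = tf.compat.v1.train.Saver().save(
            sess, "/tmp/demo_ckpt/model.ckpt")

# Even without a .meta file, the checkpoint holds every variable;
# list_variables yields (name, shape) pairs for each one.
for name, shape in tf.train.list_variables(ckpt_path):
    print(name, shape)
```

With the real download you would skip the toy-checkpoint part and point `list_variables` at the extracted checkpoint path instead.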
I am working in Colab. This is my code:
# Now let's download the pretrained model from tensorflow's model zoo.
!mkdir /content/pretrained_model
%cd /content/pretrained_model
!wget http://download.tensorflow.org/models/inception_v4_2016_09_09.tar.gz
!tar xvf inception_v4_2016_09_09.tar.gz
# Exporting the inference graph
!python /content/models/research/slim/export_inference_graph.py \
--alsologtostderr \
--model_name=inception_v4 \
--output_file=/content/pretrained_model/inception_v4_inf_graph.pb
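(For context: `export_inference_graph.py` only writes the graph definition; the weights from the `.ckpt` still have to be frozen into it to end up with one self-contained `.pb`. Below is a minimal sketch of that freezing step on a toy graph, using `tf.compat.v1` APIs and made-up node names in place of the real Inception ones:)

```python
import os
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Toy graph standing in for the slim inception_v4 graph; "input",
# "output" and "w" are made-up names for illustration only.
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [None, 4], name="input")
    w = tf.compat.v1.get_variable("w", shape=[4, 2])
    y = tf.identity(tf.matmul(x, w), name="output")
    with tf.compat.v1.Session() as sess:
        sess.run(tf.compat.v1.global_variables_initializer())
        # Fold variable values into constants so a single GraphDef
        # (graph structure + weights) can be serialized as one .pb.
        frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
            sess, g.as_graph_def(), ["output"])

os.makedirs("/tmp/demo_pb", exist_ok=True)
with tf.io.gfile.GFile("/tmp/demo_pb/frozen.pb", "wb") as f:
    f.write(frozen.SerializeToString())
```

With the real model you would restore the slim graph from the checkpoint first and pass the actual output node name instead of `"output"`.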
This is the error that I am getting:
Traceback (most recent call last):
  File "/content/models/research/slim/export_inference_graph.py", line 162, in <module>
    tf.app.run()
  File "/tensorflow-1.15.2/python3.6/tensorflow_core/python/platform/app.py", line 40, in run
    _run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
  File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 299, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 250, in _run_main
    sys.exit(main(argv))
  File "/content/models/research/slim/export_inference_graph.py", line 128, in main
    FLAGS.dataset_dir)
  File "/content/models/research/slim/datasets/dataset_factory.py", line 59, in get_dataset
    reader)
  File "/content/models/research/slim/datasets/imagenet.py", line 187, in get_split
    labels_to_names = create_readable_names_for_imagenet_labels()
  File "/content/models/research/slim/datasets/imagenet.py", line 93, in create_readable_names_for_imagenet_labels
    filename, _ = urllib.request.urlretrieve(synset_url)
  File "/usr/lib/python3.6/urllib/request.py", line 248, in urlretrieve
    with contextlib.closing(urlopen(url, data)) as fp:
  File "/usr/lib/python3.6/urllib/request.py", line 223, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.6/urllib/request.py", line 532, in open
    response = meth(req, response)
  File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python3.6/urllib/request.py", line 564, in error
    result = self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 756, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python3.6/urllib/request.py", line 532, in open
    response = meth(req, response)
  File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python3.6/urllib/request.py", line 570, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 650, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found
Thank you
The 404 comes from a broken URL in tensorflow/models: `create_readable_names_for_imagenet_labels()` in `datasets/imagenet.py` tries to download the ImageNet synset file from a URL that no longer exists. I submitted a PR, tensorflow/models#9207; applying that change fixes the 404 error.
See the instructions at https://github.com/tensorflow/models/tree/master/research/slim#exporting-the-inference-graph