I'm trying to convert a pretrained model for use with TensorFlow.js: I chose mask_rcnn_inception_v2_coco.
tensorflowjs_converter expects specific output_node_names. Various resources on the web point me to tools like summarize_graph to help with inspecting potential output node names.
Unfortunately, I'm running this on Google Colab, and (from what I can tell) I can't install Bazel there, which I need in order to build summarize_graph from source and identify which output_node_names to pass to the converter.
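For context, the converter invocation in question looks roughly like the following. The frozen-graph path and node names below are illustrative placeholders, not values from this post; `tf_frozen_model` is the input format older tfjs-converter releases used for frozen graphs:

```shell
# convert a frozen TF graph to a TensorFlow.js web model
# (paths and node names are example placeholders)
tensorflowjs_converter \
    --input_format=tf_frozen_model \
    --output_node_names='detection_boxes,detection_scores' \
    frozen_inference_graph.pb \
    web_model/
```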
Am I missing something here? Is there a more straightforward way to go from an existing pretrained model to TensorFlow.js (for inference in the browser)?
This article helped me: https://developer.arm.com/technologies/machine-learning-on-arm/developer-material/how-to-guides/optimizing-neural-networks-for-mobile-and-embedded-devices-with-tensorflow/determine-the-names-of-input-and-output-nodes
Get TensorBoard up and running (this can be done on Windows or anything else; the following is for an Ubuntu install).
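A sketch of the two steps, assuming the frozen graph lives at frozen_inference_graph.pb and TensorBoard logs go under /tmp/tensorboard-logdir (both paths are examples, not from the original answer):

```shell
# 1. start TensorBoard, serving from a log directory (example path)
tensorboard --logdir=/tmp/tensorboard-logdir

# 2. in a second terminal: write the frozen graph into that log directory
#    so TensorBoard can render it (--model_dir takes the .pb file path)
python import_pb_to_tensorboard.py \
    --model_dir=frozen_inference_graph.pb \
    --log_dir=/tmp/tensorboard-logdir
```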
(If you don't have import_pb_to_tensorboard.py, you can download it from https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/import_pb_to_tensorboard.py and run it from whatever directory you save it in.)
Use the above command to import your model into the TensorBoard instance you are serving. In TensorBoard you can click into the model graph and see what the final output node is called. Mine is called "import/final_result".
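If you can't build summarize_graph at all, its output heuristic is simple enough to apply by hand: candidate output nodes are the ones no other node consumes. A minimal pure-Python sketch of that idea (the toy graph and node names below are made up for illustration; on a real frozen graph you would build the name-to-inputs mapping from graph_def.node):

```python
def candidate_output_nodes(graph):
    """Return node names that are not an input to any other node."""
    consumed = set()
    for inputs in graph.values():
        for name in inputs:
            # strip control-dependency (^node) and port (node:0) decorations
            consumed.add(name.lstrip("^").split(":")[0])
    return sorted(name for name in graph if name not in consumed)

# toy graph: image -> conv -> relu -> detection_boxes
toy = {
    "image": [],
    "conv": ["image"],
    "relu": ["conv"],
    "detection_boxes": ["relu"],
}
print(candidate_output_nodes(toy))  # -> ['detection_boxes']
```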