How to export tflite model maker model_spec for offline usage?


I'm using a tflite model maker efficientdet_lite3 model for object detection. The basic setup goes like this:

import numpy as np
import os
from tflite_model_maker.config import ExportFormat, QuantizationConfig
from tflite_model_maker import model_spec
from tflite_model_maker import object_detector
from tflite_support import metadata
import tensorflow as tf

assert tf.__version__.startswith('2')
tf.get_logger().setLevel('ERROR')
from absl import logging
logging.set_verbosity(logging.ERROR)


train_data = object_detector.DataLoader.from_pascal_voc(
    'my_data/train',
    'my_data/train',
    ['obj']
)

val_data = object_detector.DataLoader.from_pascal_voc(
    'my_data/validate',
    'my_data/validate',
    ['obj']
)

spec = model_spec.get('efficientdet_lite3')

model = object_detector.create(
    train_data,
    model_spec=spec,
    batch_size=4,
    train_whole_model=True,
    epochs=200,
    validation_data=val_data
)

This works fine on a machine with internet access and even yields some usable results. On a machine without internet access, however, I get an urlopen error, which I assume means the model spec is being fetched from the internet.

The model itself can be found and downloaded here: https://www.kaggle.com/models/tensorflow/efficientdet/frameworks/tensorFlow2/variations/lite3-detection but I have no idea how to create a custom model_spec object for training from it.

Can you please help me set up this project for completely offline use? This is important for me because the training data is on an offline computer connected to a closed network, and transferring this data to another computer every time is a painful process. It would be much easier to transfer the necessary tflite data to the offline machine instead.
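For reference, here is the kind of offline setup I'm aiming for. It's only a sketch and untested on my side: it assumes that model maker resolves its pretrained backbone through tensorflow_hub, which honors the TFHUB_CACHE_DIR environment variable, and that the cache directory was populated on the internet-connected machine and copied over (the path /data/tfhub_cache is hypothetical).

```python
import os

# Assumption: /data/tfhub_cache is a copy of the TF Hub cache directory
# populated on the internet-connected machine and transferred here.
# It must be set *before* anything triggers a hub download.
os.environ['TFHUB_CACHE_DIR'] = '/data/tfhub_cache'

# With the cache in place, the usual spec lookup should then resolve
# locally instead of reaching out to the network:
# from tflite_model_maker import model_spec
# spec = model_spec.get('efficientdet_lite3')
print(os.environ['TFHUB_CACHE_DIR'])
```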

I've tried using this spec: https://www.tensorflow.org/lite/api_docs/python/tflite_model_maker/object_detector/EfficientDetLite3Spec and pointing it at a local saved_model.pb file, but I get all sorts of weird errors like:

ERROR:absl:hub.KerasLayer is trainable but has zero trainable weights.

    ValueError: Could not find matching concrete function to call loaded from the SavedModel. Got:
      Positional arguments (1 total):
        * <tf.Tensor 'imgs:0' shape=(None, 512, 512, 3) dtype=float32>
      Keyword arguments: {}
    
     Expected these arguments to match one of the following 1 option(s):
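
My best guess at the cause: the Kaggle download is the *detection* export (a serving-only SavedModel), while the spec's hub.KerasLayer seems to expect the trainable *feature-vector* variant, which would explain both the "zero trainable weights" warning and the "Could not find matching concrete function" error. As a sketch of where tensorflow_hub would look for a cached copy: it stores each module under the SHA-1 hex digest of its handle URL inside TFHUB_CACHE_DIR, so the extracted feature-vector model would need to land in that exact subdirectory. The handle URL, its version, and the cache path below are my assumptions; check the uri in your installed spec's source.

```python
import hashlib
import os

# tensorflow_hub caches a module at <cache_dir>/<sha1(handle) hex digest>.
# Handle is assumed; verify against the uri used by EfficientDetLite3Spec.
handle = 'https://tfhub.dev/tensorflow/efficientdet/lite3/feature-vector/1'
cache_dir = '/data/tfhub_cache'  # hypothetical cache location
subdir = hashlib.sha1(handle.encode('utf8')).hexdigest()
print(os.path.join(cache_dir, subdir))
```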

1 Answer

I'm still hitting the same issue. The only way I've gotten around it is to train with online access and then load the saved model offline on the edge device.

You can also try training with plain TensorFlow 2 and then exporting the model to the TFLite format. With this workaround you could load the EfficientDet weights offline.