TLDR: Can someone show how to create an LSTM model, convert it to TFLite, and run it on Android with TensorFlow Lite 1.15?
I am trying to create a simple LSTM model and run it in an Android application with TensorFlow v1.15.
**It is the same case when using GRU and SimpleRNN layers.**
Creating a simple LSTM model
I am working in Python, trying two TensorFlow/Keras versions: the latest (2.4.1, with built-in Keras) and 1.15 (where I install Keras 2.2.4).
I create this simple model:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers  # with TF 1.15 + standalone Keras 2.2.4 I use the keras / keras.layers imports instead

model = keras.Sequential()
model.add(layers.Embedding(input_dim=1000, output_dim=64))
model.add(layers.LSTM(128))
model.add(layers.Dense(10))
model.summary()
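In case dynamic shapes are part of the problem, a variant of the same model with a fully fixed input shape would look like this (a sketch; batch size 1 and sequence length 100 are arbitrary placeholders, not values from my real setup):
# Sketch: same architecture, but with the batch size and sequence length pinned,
# since TFLite conversion of RNN layers often expects fixed shapes.
fixed_model = keras.Sequential()
fixed_model.add(layers.Embedding(input_dim=1000, output_dim=64,
                                 batch_input_shape=(1, 100)))
fixed_model.add(layers.LSTM(128))
fixed_model.add(layers.Dense(10))
fixed_model.summary()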
Saving it
I save it in both the "SavedModel" and "h5" formats:
model.save(f'output_models/simple_lstm_saved_model_format_{tf.__version__}', save_format='tf')
model.save(f'output_models/simple_lstm_{tf.__version__}.h5', save_format='h5')
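A quick sanity check on the saved artifacts (a sketch, assuming TF2's built-in Keras) would be to reload them:
# Sketch: reload both saved artifacts to confirm they round-trip in the same environment.
reloaded_saved_model = keras.models.load_model(
    f'output_models/simple_lstm_saved_model_format_{tf.__version__}')
reloaded_h5 = keras.models.load_model(f'output_models/simple_lstm_{tf.__version__}.h5')
reloaded_h5.summary()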
Converting to TFLite
I create & save the model in both the v1.15 and v2 environments. Then I try to convert it to TFLite using several methods.
In TF2:
- I try to convert from keras model:
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open(f"output_models/simple_lstm_tf_v{tf.__version__}.tflite", 'wb') as f:
f.write(tflite_model)
- I try to convert from saved model:
converter_saved_model = tf.lite.TFLiteConverter.from_saved_model(saved_model_path)
tflite_model_from_saved_model = converter_saved_model.convert()
with open(f"{saved_model_path}_converted_tf_v{tf.__version__}.tflite", 'wb') as f:
f.write(tflite_model_from_saved_model)
- I try to convert from the Keras saved model (h5), using both tf.compat.v1.lite.TFLiteConverter and tf.lite.TFLiteConverter:
converter_h5 = tf.compat.v1.lite.TFLiteConverter.from_keras_model_file(h5_model_path)
# converter_h5 = tf.lite.TFLiteConverter.from_keras_model_file(h5_model_path) # option 2
tflite_model_from_h5 = converter_h5.convert()
with open(f"{h5_model_path.replace('.h5','')}_converted_tf_v1_lite_from_keras_model_file_v{tf.__version__}.tflite", 'wb') as f:
f.write(tflite_model_from_h5)
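In the TF 1.15 environment I would expect the conversion to go through the v1 converter's from_keras_model_file (a sketch only; it assumes the h5 file saved above can be read by tf.keras's loader in 1.15):
# Sketch: conversion run inside the TF 1.15 environment rather than the TF2 one above.
import tensorflow as tf  # expected to report 1.15.x here

converter_v115 = tf.lite.TFLiteConverter.from_keras_model_file(h5_model_path)
tflite_model_v115 = converter_v115.convert()
with open(h5_model_path.replace('.h5', '') + f'_converted_in_tf_v{tf.__version__}.tflite', 'wb') as f:
    f.write(tflite_model_v115)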
Android Application
build.gradle (Module: app)
When I want to use v2, I use:
implementation 'org.tensorflow:tensorflow-lite-task-vision:0.0.0-nightly'
implementation 'org.tensorflow:tensorflow-lite-task-text:0.0.0-nightly'
When I want to use v1.15, I use:
implementation 'org.tensorflow:tensorflow-lite:1.15.0'
in build.gradle.
Then, I follow the common TFLite model-loading code on Android:
private MappedByteBuffer loadModelFile(Activity activity) throws IOException {
    // Memory-map the .tflite file from the app's assets.
    AssetFileDescriptor fileDescriptor = activity.getAssets().openFd(getModelPath());
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}

LoadLSTM(Activity activity) {
    try {
        tfliteModel = loadModelFile(activity);
    } catch (IOException e) {
        e.printStackTrace();
    }
    // Create the interpreter from the mapped model.
    tflite = new Interpreter(tfliteModel, tfliteOptions);
    Log.d(TAG, "*** Loaded model *** " + getModelPath());
}
When I use v2, the model is loaded.
When I use v1.15, with ALL of the options I've tried, I receive errors like the following:
A/libc: Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x70 in tid 17686 (CameraBackgroun), pid 17643 (flitecamerademo)
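To rule out a broken .tflite file before blaming the Android side, a desktop check with the Python TFLite interpreter (a sketch; the path is a placeholder for one of the files produced above) would be:
# Sketch: load the converted model with the Python TFLite interpreter and print
# its input/output details; a failure here would point at the conversion step
# rather than at the Android runtime.
interpreter = tf.lite.Interpreter(model_path='output_models/simple_lstm_tf_v2.4.1.tflite')  # placeholder path
interpreter.allocate_tensors()
print(interpreter.get_input_details())
print(interpreter.get_output_details())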
I need a simple outcome: create an LSTM model and make it work on Android with TFLite v1.15.
What am I missing? Thanks.