Serving Keras model with TensorFlow Serving error


I have created a model using Keras that works locally, but after upgrading TensorFlow to 2.17.0 I started receiving a strange error on the TensorFlow Serving side.

The model was serialized like this:

import tensorflow as tf 

# tf.version.VERSION == 2.17.0

tf.saved_model.save(
    model,
    export_dir="../models/jager/61",
    #signatures={"serving_default": export_model(preprocessor,model)},
)

To serve the model I am using the tensorflow/serving:nightly Docker image (I also tried latest, but it only ships TensorFlow 2.14.0):

    root@23556ef83203:/# tensorflow_model_server --version
    TensorFlow ModelServer: 0.0.0+nightly.sha.no_git
    TensorFlow Library: 2.17.0

After sending a predict request to the model I receive a format error:

    Could not find variable sequential_20/dense_41/bias. This could mean that the variable has been deleted. In TF1, it can also mean the variable is uninitialized. Debug info: container=localhost, status error message=Resource localhost/sequential_20/dense_41/bias/N10tensorflow3VarE does not exist.
        [[{{function_node __inference_serving_default_474794}}{{node sequential_20_1/dense_41_1/add/ReadVariableOp}}]]
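For context, this is the shape of the predict request being sent. A minimal sketch, assuming TF Serving's standard REST endpoint on the default port 8501, with the model name jager and version 61 taken from the export path above; the input batch is a made-up placeholder, the real input shape depends on the model:

```python
import json

# Hypothetical input batch; substitute the model's actual input shape.
batch = [[0.1, 0.2, 0.3]]

# TF Serving's REST predict API expects a JSON body of the form
# {"instances": [...]} (row format).
payload = json.dumps({"instances": batch})

# Endpoint per TF Serving's REST conventions; host/port are assumptions.
url = "http://localhost:8501/v1/models/jager/versions/61:predict"

print(url)
print(payload)

# The actual request would then be sent with, e.g.:
#   import requests
#   resp = requests.post(url, data=payload)
```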

This looks like a version conflict, but I tried a lot of different versions with similar errors.
