TensorFlow Serving: different outcomes for the same input


I am using TensorFlow Serving to serve a pre-trained model. The strange thing is that when I feed the same input to the model, I get a different outcome each time.

I suspect the problem is in variable initialization. Is there any way to debug my model, or how can I find the cause? Thanks.

1 Answer

Two common problems:

  1. There is a known issue with main_op in which variables are re-initialized to random values when the model is loaded.
  2. You left dropout layers enabled in your prediction graph.
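Both causes produce the same symptom: either the weights used at serving time are not the trained ones, or the forward pass is stochastic. A framework-free numpy sketch (all names here are illustrative, not TensorFlow API) of cause (1), where each "load" re-initializes weights randomly instead of restoring the saved ones:

```python
import numpy as np

def load_model(saved_weights=None, rng=None):
    """Simulate loading a model: restore saved weights, or (buggy) re-init randomly."""
    w = saved_weights if saved_weights is not None else rng.standard_normal(64)
    return lambda x: float(x @ w)

x = np.ones(64)
rng = np.random.default_rng(0)

# Buggy path: variables re-initialized on every load -> outputs differ.
buggy_a, buggy_b = load_model(rng=rng), load_model(rng=rng)
print(buggy_a(x) == buggy_b(x))   # almost surely False

# Correct path: the same saved weights are restored -> outputs match.
saved = np.linspace(0.0, 1.0, 64)
fixed_a, fixed_b = load_model(saved_weights=saved), load_model(saved_weights=saved)
print(fixed_a(x) == fixed_b(x))   # True
```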

To address (1), use this instead:

# These modules live in TensorFlow's internal Python API (TF 1.x):
from tensorflow.python.ops import control_flow_ops
from tensorflow.python.ops import lookup_ops
from tensorflow.python.ops import variables

def main_op():
  # Initialize only local variables and lookup tables. Do NOT run the global
  # variable initializer, which would overwrite the restored trained weights
  # with fresh random values.
  init_local = variables.local_variables_initializer()
  init_tables = lookup_ops.tables_initializer()
  return control_flow_ops.group(init_local, init_tables)
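For context, this main_op is passed to the SavedModel builder at export time. A hedged TF 1.x export fragment (export_dir, serving_graph, saver, checkpoint_path, and signature_def_map are placeholders you would already have in your export script, not values defined here):

```python
import tensorflow as tf

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
with tf.Session(graph=serving_graph) as sess:
    saver.restore(sess, checkpoint_path)  # restore trained weights
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map=signature_def_map,
        main_op=main_op(),  # runs at load time in place of the default op
    )
builder.save()
```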

To address (2), make sure you are not exporting your training graph directly. Build a separate graph for prediction/serving. If you are using the tf.estimator framework, add dropout layers only when mode is tf.estimator.ModeKeys.TRAIN.
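To see why leftover dropout causes non-determinism, here is a framework-free numpy sketch (the forward function and its names are illustrative, not TensorFlow API): with dropout active, the same input yields different outputs; with it disabled, the output is stable.

```python
import numpy as np

def forward(x, w, training, rng, rate=0.5):
    """Tiny linear layer followed by (inverted) dropout when training=True."""
    h = x @ w
    if training:
        # Random mask -> stochastic output if this runs at serving time.
        mask = (rng.random(h.shape) >= rate) / (1.0 - rate)
        h = h * mask
    return h

rng = np.random.default_rng(0)
x = np.ones(8)
w = rng.standard_normal((8, 64))

# Dropout accidentally left on: same input, different outputs.
a = forward(x, w, training=True, rng=rng)
b = forward(x, w, training=True, rng=rng)
print(np.allclose(a, b))   # almost surely False

# Dropout disabled for serving: same input, identical outputs.
c = forward(x, w, training=False, rng=rng)
d = forward(x, w, training=False, rng=rng)
print(np.allclose(c, d))   # True
```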