TensorFlow v2: Model Key not found during deserialization


I am working with TensorFlow v2.8.0 and Xilinx Vitis-AI. Vitis-AI provides a wrapper that converts models into hardware-friendly DPU instructions, and during this process it deserializes the model to check for compatibility. In my case, however, it is unable to find a specific layer in the model.

Loading the model works, but it emits the warning below about the missing training configuration, so I compile it manually:

import tensorflow as tf

model = tf.keras.models.load_model('Res18_RFSoC/UT_HAR_cvt_ResNet18.h5')
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
model.compile(optimizer=optimizer,
              loss=loss_fn,
              metrics=['accuracy'])
model.build((None, 1, 250, 90))
 
WARNING:tensorflow:No training configuration found in the save file, so the model was *not* compiled. Compile it manually.
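
For reference, the inspector call looks roughly like the sketch below, reconstructed from the traceback further down. The import path follows the Vitis-AI TF2 examples and the DPU target string is a placeholder, so treat both as assumptions:

from tensorflow_model_optimization.quantization.keras import vitis_inspect

# Hypothetical DPU target string -- substitute the actual target/arch used.
inspector = vitis_inspect.VitisInspector(target='DPUCZDX8G_ISA1_B4096')
inspector.inspect_model(model,
                        dump_results=True,
                        dump_results_file="inspect_results.txt",
                        verbose=50)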

Passing the model through the inspector, however, raises a KeyError for a layer it cannot find:

[VAI INFO] Update include_bias_corr: False
[VAI INFO] Update include_fast_ft: False
[VAI INFO] Update include_cle: False
[VAI INFO] VitisPof2SOptimizeTransformsPipeline configs:
[VAI INFO] - remove_dropout: True
[VAI INFO] - separate_conv_act: True
[VAI INFO] - fold_conv_bn: True
[VAI INFO] - convert_bn_to_dwconv: True
[VAI INFO] - convert_relu6_to_relu: False
[VAI INFO] - convert_tf_op_to_keras: True
[VAI INFO] - include_cle: False
[VAI INFO] - cle_to_relu6: False
[VAI INFO] - cle_steps: 10
[VAI INFO] - cle_balance_method: max
[VAI INFO] - cle_weight_threshold: 0.1
[VAI INFO] - train_with_bn: False
 
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/tmp/ipykernel_112/1786654664.py in <module>
      6                         dump_results=True,
      7                         dump_results_file="inspect_results.txt",
----> 8                         verbose=50)
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow_model_optimization/python/core/quantization/keras/vitis/vitis_inspect.py in inspect_model(self, float_model, input_shape, plot, plot_file, dump_model, dump_model_file, dump_results, dump_results_file, verbose, configs, **kwargs)
    562           candidate_layers=self._candidate_layers,
    563           layer_metadata=layer_metadata,
--> 564           quantize_strategy=self._quantize_strategy)
    565 
    566       logger.debug('Quantize Pipeline Configurations:')
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow_model_optimization/python/core/quantization/keras/vitis/vitis_quantize.py in create_optimize_model(model, candidate_layers, layer_metadata, quantize_strategy)
   1002   optimize_pipeline = quantize_strategy.get_optimize_pipeline()
   1003   optimized_model, layer_metadata = optimize_pipeline.apply(
-> 1004       model, candidate_layers, layer_metadata)
   1005   return optimized_model, layer_metadata
   1006 
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow_model_optimization/python/core/quantization/keras/vitis/quantize_strategy/pof2s/vitis_pof2s_transforms_pipeline.py in apply(self, model, candidate_layers, layer_metadata)
     98       transformed_model, layer_metadata = _apply_availables(
     99           model, configs, available_transforms, candidate_layers,
--> 100           layer_metadata)
    101 
    102     # Cross Layer Equalization
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow_model_optimization/python/core/quantization/keras/vitis/quantize_strategy/pof2s/vitis_pof2s_transforms_pipeline.py in _apply_availables(model, configs, available_transforms, candidate_layers, layer_metadata)
     52   transformed_model, layer_metadata = model_transformer.ModelTransformer(
     53       model, transforms, candidate_layers,
---> 54       layer_metadata).recursive_transform()
     55   return transformed_model, layer_metadata
     56 
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow_model_optimization/python/core/quantization/keras/vitis/graph_transformations/model_transformer.py in recursive_transform(self)
    738     return ModelTransformer(transformed_model, self.transforms,
    739                             self.candidate_layers,
--> 740                             self.layer_metadata).transform()
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow_model_optimization/python/core/quantization/keras/vitis/graph_transformations/model_transformer.py in transform(self)
    696     # Reconstruct model from the config, using the cloned layers.
    697     if self._is_functional_model(self.model):
--> 698       transformed_model = keras.Model.from_config(self._config, custom_objects)
    699     else:
    700       transformed_model = keras.Sequential.from_config(self._config,
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/keras/engine/training.py in from_config(cls, config, custom_objects)
   2639     with generic_utils.SharedObjectLoadingScope():
   2640       input_tensors, output_tensors, created_layers = (
-> 2641           functional.reconstruct_from_config(config, custom_objects))
   2642       # Initialize a model belonging to `cls`, which can be user-defined or
   2643       # `Functional`.
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/keras/engine/functional.py in reconstruct_from_config(config, custom_objects, created_layers)
   1336         while layer_nodes:
   1337           node_data = layer_nodes[0]
-> 1338           if process_node(layer, node_data):
   1339             layer_nodes.pop(0)
   1340           else:
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/keras/engine/functional.py in process_node(layer, node_data)
   1254         kwargs = input_data[3]
   1255         try:
-> 1256           kwargs = _deserialize_keras_tensors(kwargs, created_layers)
   1257         except IndexError:
   1258           # Happens if keras tensors in kwargs are still unprocessed
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/keras/engine/functional.py in _deserialize_keras_tensors(kwargs, layer_map)
   1227 
   1228     kwargs = tf_utils.convert_inner_node_data(kwargs, wrap=True)
-> 1229     return tf.nest.map_structure(_deserialize_keras_tensor, kwargs)
   1230 
   1231   def process_node(layer, node_data):
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow/python/util/nest.py in map_structure(func, *structure, **kwargs)
    912 
    913   return pack_sequence_as(
--> 914       structure[0], [func(*x) for x in entries],
    915       expand_composites=expand_composites)
    916 
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/tensorflow/python/util/nest.py in <listcomp>(.0)
    912 
    913   return pack_sequence_as(
--> 914       structure[0], [func(*x) for x in entries],
    915       expand_composites=expand_composites)
    916 
 
/opt/vitis_ai/conda/envs/vitis-ai-tensorflow2/lib/python3.7/site-packages/keras/engine/functional.py in _deserialize_keras_tensor(t)
   1210         tensor_index = t[2]
   1211 
-> 1212         layer = layer_map[layer_name]
   1213         new_node_index = get_node_index(layer, node_index)
   1214         if new_node_index is None:
 
KeyError: 'batch_normalization_27'

Checking the model, the variables for the batch_normalization_27 layer do exist, including gamma, beta, moving mean, and moving variance (output truncated here for readability):

In layer  batch_normalization_27/gamma:0  the content is:  <tf.Variable 'batch_normalization_27/gamma:0' shape=(128,) dtype=float32, numpy=
array([1.0391743 , 1.0215051 , 1.0527023 , 1.1652509 , 1.0169231 ,
       1.030983  , 1.1270113 , 1.0446112 , 1.0743241 , 1.0292431 ,
       1.0154777 , 1.0637294 , 1.1854355 , 1.0117522 , 1.0395036 ,
       1.0309842 , 1.10472, ....], dtype=float32)>
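
The weights being present does not necessarily mean the layer is part of the reconstructed graph config. A quick way to check is to list the layer names of the loaded model (plain Keras API, nothing Vitis-specific):

# List all layer names and look for the one the inspector cannot resolve.
layer_names = [layer.name for layer in model.layers]
print('batch_normalization_27' in layer_names)

# get_layer raises ValueError if the layer is absent from the model graph,
# even though its weight variables exist in the saved file.
bn27 = model.get_layer('batch_normalization_27')
print(bn27.get_config())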

My current suspicion is how I am loading the model in the first place, since I am not passing any custom layer objects to load_model. What is odd, though, is that the inspector does see the other batch normalization layers, just not layer 27 specifically. See the sketch below for how I would pass custom objects.
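
If custom layers are indeed the cause, the usual fix would be to pass them explicitly when loading. A minimal sketch, assuming a hypothetical custom layer class MyCustomLayer (I have not confirmed the model actually contains one):

import tensorflow as tf

# 'MyCustomLayer' is a placeholder name; substitute whatever custom classes
# the model was originally built with.
model = tf.keras.models.load_model(
    'Res18_RFSoC/UT_HAR_cvt_ResNet18.h5',
    custom_objects={'MyCustomLayer': MyCustomLayer})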

I understand this is a Vitis-specific question, but the failure happens on TensorFlow's end during deserialization. Any ideas are highly appreciated!
