NotImplementedError while converting a tensorflow model to coreml using coremltools


I am facing a NotImplementedError while trying to convert a MoViNets model to Core ML.

The model is saved in the SavedModel format, and I am using TensorFlow 2.13.0, Core ML Tools 6.3.0, and Python 3.10.6.

I don't understand the origin of the StatefulPartitionedCall operation or its meaning. Any idea what could be going on?

Here are the steps to reproduce the error:

  1. Download the SavedModel from the following link.
  2. Convert the model using:
import coremltools as ct
coreml_model = ct.convert(saved_dir, convert_to="mlprogram")

where saved_dir is the path to the downloaded model.

The error:

NotImplementedError: Conversion for TF op 'StatefulPartitionedCall' not implemented.
 
name: "StatefulPartitionedCall"
op: "StatefulPartitionedCall"
input: "image"
input: "unknown"
input: "unknown_0"
input: "unknown_1"
input: "unknown_2"
input: "unknown_3"
input: "unknown_4"
input: "unknown_5"
input: "unknown_6"
input: "unknown_7"
input: "unknown_8"
input: "unknown_9"
input: "unknown_10"
input: "unknown_11"
input: "unknown_12"
input: "unknown_13"
input: "unknown_14"
input: "unknown_15"
...
input: "unknown_597"
input: "unknown_598"
input: "unknown_599"
attr {
  key: "Tin"
  value {
    list {
      type: DT_FLOAT
      type: DT_FLOAT
      type: DT_FLOAT
...
      type: DT_FLOAT
    }
  }
}
attr {
  key: "Tout"
  value {
    list {
      type: DT_FLOAT
    }
  }
}
attr {
  key: "_XlaMustCompile"
  value {
    b: true
  }
}
attr {
  key: "_collective_manager_ids"
  value {
    list {
    }
  }
}
attr {
  key: "_has_manual_control_dependencies"
  value {
    b: true
  }
}
attr {
  key: "_read_only_resource_inputs"
  value {
    list {
      i: 1
      i: 2
      i: 3
      i: 4
      i: 5
      i: 6
      i: 7
      i: 8
      i: 9
      i: 10
      i: 11
      i: 12
      i: 13
      i: 14
      i: 15
...
      i: 597
      i: 598
      i: 599
      i: 600
      i: 601
    }
  }
}
attr {
  key: "config"
  value {
    s: ""
  }
}
attr {
  key: "config_proto"
  value {
    s: "\n\007\n\003CPU\020\001\n\007\n\003GPU\020\0012\005*\0010J\0008\001\202\001\000"
  }
}
attr {
  key: "executor_type"
  value {
    s: ""
  }
}
attr {
  key: "f"
  value {
    func {
      name: "__inference_predict_frozen_288748"
    }
  }
}
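From what I can tell, `StatefulPartitionedCall` is the TF2 graph op that invokes a `tf.function` from the graph's function library (here `__inference_predict_frozen_288748`), so the entire forward pass sits behind a single call node. A small sketch to inspect the serving graph and confirm this (again assuming the `"serving_default"` signature key):

```python
import tensorflow as tf

def list_serving_ops(saved_dir: str):
    """Return the distinct op types in the serving signature's graph.

    Sketch only: assumes a "serving_default" signature. If the result is
    dominated by StatefulPartitionedCall, the real layers live inside a
    wrapped tf.function rather than in the top-level graph.
    """
    loaded = tf.saved_model.load(saved_dir)
    fn = loaded.signatures["serving_default"]
    return sorted({op.type for op in fn.graph.get_operations()})
```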

Since TensorFlow 2.13.0 has not been tested with coremltools, I also tried the conversion with TensorFlow 2.12.0, but the error persisted.
