I want to deploy a model with TensorFlow Serving. The model contains some ops that cannot be placed on the GPU, so when deploying it I need to enable allow_soft_placement through the file passed to TensorFlow Serving's --platform_config_file flag. How should I write that file?
This is what I have so far:

platform_configs: {
  key: "tensorflow"
  value: {
    source_adapter_config: {
      type_url: "type.googleapis.com/tensorflow.serving.SavedModelBundleSourceAdapterConfig"
      value: # <-- what goes here?
    }
  }
}
The source_adapter_config field is a google.protobuf.Any (from any.proto), so its value field must be serialized bytes. How do I write those bytes in a text config file? Thanks very much.
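For reference, here is a sketch of what I think the file should look like. Text-format protobuf lets you expand an Any inline with the [type_url] { ... } syntax instead of writing raw bytes. The legacy_config and session_config field names are my assumption based on the TensorFlow Serving protos (SavedModelBundleSourceAdapterConfig wrapping a SessionBundleConfig, whose session_config is a tensorflow.ConfigProto):

```proto
platform_configs: {
  key: "tensorflow"
  value: {
    source_adapter_config: {
      # Text-format protobuf expands an Any inline via [type_url] { ... },
      # so the serialized bytes never have to be written by hand.
      [type.googleapis.com/tensorflow.serving.SavedModelBundleSourceAdapterConfig] {
        legacy_config: {
          session_config: {
            allow_soft_placement: true
          }
        }
      }
    }
  }
}
```

Is this inline-Any form accepted by the model server, or does the value really have to be provided as bytes?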