How to use custom operations with TensorFlow Serving

What is the ideal way to have tensorflow model server recognize my custom operation?

I have a custom operation written following this guide: https://www.tensorflow.org/guide/extend/op

I'm able to use the op by calling tf.load_op_library (see the snippet below the error), but when I run tensorflow_model_server

tensorflow_model_server --port=9000 \
                        --model_name=mymodel \
                        --model_base_path=/serving/mymodel

I get the following error about being unable to find my op.

tensorflow_serving/util/retrier.cc:37] Loading servable: {name: mymodel version: 1} failed: Not found: Op type not registered 'MyOpp' in binary running on c37a4ef2d4b4.
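
For reference, the op works from plain Python along these lines (the .so path here is a placeholder for my build output):

import tensorflow as tf

# Load the compiled custom op library; the path is illustrative.
foo_module = tf.load_op_library('/serving/libs/my_opp.so')
# The registered op 'MyOpp' is exposed as a snake_case Python function.
print(foo_module.my_opp)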

4 Answers

Answer 1

Did you add your op library to the BUILD file of the binary where you want to call it?

Answer 2

Here is a doc describing how to do that: https://www.tensorflow.org/tfx/serving/custom_op

The bottom line is that you need to rebuild tensorflow_model_server with your op linked in. In tensorflow_serving/model_servers/BUILD:

SUPPORTED_TENSORFLOW_OPS = [
    ...
    "//tensorflow_serving/.../...your_op"
]
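
After adding the dependency, rebuild the server and serve your model with the resulting binary, e.g.:

bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server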

Answer 3

Here are the things that I wanted to do with my op:

  • generate Python wrappers
  • add the op to the pip package
  • have the op linked into TensorFlow so tensorflow-serving could execute it

I placed my op in tensorflow/contrib/foo. Here is what the source tree looked like:

.
├── BUILD
├── LICENSE
├── __init__.py
├── foo_op.cc
├── foo_op_gpu.cu.cc
└── foo_op.h

My __init__.py file had the import for the generated wrappers:

from tensorflow.contrib.foo.ops.gen_foo import *

I added an import in tensorflow/contrib/__init__.py:

from tensorflow.contrib import foo

Here is my tensorflow/contrib/foo/BUILD file:

licenses(["notice"])  # Apache 2.0
exports_files(["LICENSE"])

package(default_visibility = ["//visibility:public"])

load("//tensorflow:tensorflow.bzl", "tf_custom_op_py_library")
load("//tensorflow:tensorflow.bzl", "tf_gen_op_libs")
load("//tensorflow:tensorflow.bzl", "tf_gen_op_wrapper_py")
load("//tensorflow:tensorflow.bzl", "tf_kernel_library")

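# Compiles the foo* sources (foo_op.cc, foo_op_gpu.cu.cc, foo_op.h) into kernels;
# alwayslink keeps the op/kernel registrations from being stripped at link time.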
tf_kernel_library(
    name = "foo_op_kernels",
    prefix = "foo",
    alwayslink = 1,
)
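# Creates the foo_op_lib target containing the op registrations.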
tf_gen_op_libs(
    op_lib_names = ["foo"],
)
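# Generates the Python wrapper module (gen_foo) for the registered ops.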
tf_gen_op_wrapper_py(
    name = "foo",
    visibility = ["//visibility:public"],
    deps = [
        ":foo_op_kernels",
    ],
)
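# Bundles __init__.py and the generated wrapper into an importable Python library.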
tf_custom_op_py_library(
    name = "foo_py",
    srcs = [
        "__init__.py",
    ],
    kernels = [
        ":foo_op_kernels",
    ],
    srcs_version = "PY2AND3",
    deps = [
        ":foo",
        "//tensorflow/contrib/util:util_py",
        "//tensorflow/python:common_shapes",
        "//tensorflow/python:framework_for_generated_wrappers",
        "//tensorflow/python:platform",
        "//tensorflow/python:util",
    ],
)

Here are the tensorflow bazel files I had to touch to get it working.

  • tensorflow/contrib/BUILD
    • Add foo_op_kernels to contrib_kernels deps
    • Add foo_op_lib to contrib_ops_op_lib deps
    • Add foo to contrib_py deps
  • tensorflow/tools/pip_package/BUILD
    • Added my Python target to COMMON_PIP_DEPS (see the sketch after this list)
  • tensorflow/core/BUILD
    • Added my kernels to all_kernels_statically_linked. I might have gone overboard with this one, but it worked.
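
Each of these is a one-line addition to an existing list. For example, the pip package change in tensorflow/tools/pip_package/BUILD looked roughly like this (the label assumes the layout above):

COMMON_PIP_DEPS = [
    ...
    "//tensorflow/contrib/foo:foo_py",
]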

Here are the tensorflow serving bazel files:

  • WORKSPACE
    • Change org_tensorflow to be a local_repository pointing at my TensorFlow checkout rather than Google's tensorflow_http_archive (sketched below)
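
The WORKSPACE change is a standard Bazel local_repository; the path below is a placeholder for your local checkout:

local_repository(
    name = "org_tensorflow",
    path = "/path/to/my/tensorflow",  # hypothetical local path
)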

Then I modified tensorflow_serving/tools/docker/Dockerfile.devel-gpu to clone my versions of tensorflow and tensorflow-serving.

Answer 4

You can also bring tensorflow in as a git submodule or a Bazel local_repository, which lets you use the custom macros in the repo (e.g. tf_kernel_library, tf_custom_op_py_library) for your ops.