I have a Kubeflow pipeline which trains a custom ML model (i.e. not based on sklearn / TensorFlow etc. classes). Now I would like to add serving at the end of the pipeline, i.e. I want a service in my Kubernetes cluster which uses the model to answer prediction requests, and this service should be updated with a new model after each pipeline run.
As far as I know, to serve a custom model I should:
1. Wrap my model in a `kfserving.KFModel` subclass (see the sketch after this list)
2. Build a Docker image that runs the wrapper from 1)
3. Create an InferenceService endpoint using the image from 2)
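A minimal sketch of the wrapper in step 1, assuming the older `kfserving` Python SDK (newer releases rename this to `kserve.Model`); the pickle format and the artifact path are placeholders for however the custom model is actually persisted:

```python
import pickle
from typing import Dict

import kfserving


class CustomModel(kfserving.KFModel):
    """Serves a custom model behind KFServing's HTTP prediction protocol."""

    def __init__(self, name: str):
        super().__init__(name)
        self.model = None
        self.ready = False

    def load(self):
        # Placeholder: restore the custom model from an artifact baked
        # into the image (or pulled from storage at startup).
        with open("/mnt/models/model.bin", "rb") as f:
            self.model = pickle.load(f)
        self.ready = True

    def predict(self, request: Dict) -> Dict:
        # v1 protocol: {"instances": [...]} in, {"predictions": [...]} out.
        instances = request["instances"]
        return {"predictions": [self.model.predict(x) for x in instances]}


if __name__ == "__main__":
    model = CustomModel("my-custom-model")
    model.load()
    # KFServer answers e.g. POST /v1/models/my-custom-model:predict
    kfserving.KFServer().start([model])
```

The Docker image in step 2 then only needs to install `kfserving` plus the model's dependencies and run this script as its entrypoint.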
Is there any cloud-agnostic way to do this in a Kubeflow component? (So basically the component must be able to build Docker images.)
Is there a better way to achieve this?
Maybe I should move steps 1-3 outside of the pipeline and just create a component which triggers external execution of 1-3. Can this be done?
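For step 3 specifically, one cloud-agnostic option is to have the final pipeline step create or update the endpoint through the kfserving SDK. A rough sketch, assuming the v1alpha2-era SDK that matches `kfserving.KFModel`; the service name, namespace, and image tag are made up, and the assumption is that `replace` raises when the service does not exist yet:

```python
from kubernetes import client
from kfserving import (
    KFServingClient,
    V1alpha2CustomSpec,
    V1alpha2EndpointSpec,
    V1alpha2InferenceService,
    V1alpha2InferenceServiceSpec,
    V1alpha2PredictorSpec,
    constants,
)


def deploy(image: str, name: str = "my-custom-model", namespace: str = "kubeflow"):
    # "Custom" predictor: KFServing just runs our container, which serves
    # the KFModel wrapper from step 1.
    endpoint = V1alpha2EndpointSpec(
        predictor=V1alpha2PredictorSpec(
            custom=V1alpha2CustomSpec(
                container=client.V1Container(name=name, image=image))))

    isvc = V1alpha2InferenceService(
        api_version=constants.KFSERVING_GROUP + "/" + constants.KFSERVING_VERSION,
        kind=constants.KFSERVING_KIND,
        metadata=client.V1ObjectMeta(name=name, namespace=namespace),
        spec=V1alpha2InferenceServiceSpec(default=endpoint))

    kfs = KFServingClient()
    try:
        # Roll the existing endpoint to the new image after each run...
        kfs.replace(name, isvc, namespace=namespace)
    except Exception:
        # ...or create it on the first run.
        kfs.create(isvc, namespace=namespace)
```

Each pipeline run could then call this with the image tag produced by step 2.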
 
I can't speak to Kubeflow in particular, but https://buildpacks.io/ provides a general-purpose way to build containers that satisfy certain input criteria (for example, "is a Python program with a `main` and a `requirements.txt`"). It's also possible (but more complicated) to create a new buildpack (for example, to take Python code that implements `kfserving.KFModel` and wrap a main and whatever else is needed around it). I've done this a few times for Python for demos/etc:
https://github.com/evankanderson/klr-buildpack
https://github.com/evankanderson/pyfun
Note that these aren't production-grade, just me playing around for a day or three.
You can run buildpacks locally with the `pack` command, or on a cluster using several technologies. There's detailed documentation for 5 build options at https://buildpacks.io/docs/tools/, along with a longer list of "supported platforms" at the bottom of https://buildpacks.io/features/.
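For example, a local build of the image from step 2 might look like this; the builder shown is just one option whose Python buildpack may fit your layout, and the registry name is made up:

```sh
pack build my-registry/my-model-server:latest \
    --builder gcr.io/buildpacks/builder:v1 \
    --path ./serving \
    --publish   # push straight to the registry so the cluster can pull it
```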