How to build custom pipeline in GCP using Vertex AI


I was exploring the Vertex AI AutoML feature in GCP, which lets users import datasets, then train, deploy, and predict with ML models. My use case is to do the data preprocessing on my own (I wasn't satisfied with AutoML's data preprocessing) and feed that data directly into a pipeline that trains and deploys the model. I also want to feed new data into the dataset, and have the entire pipeline (from data preprocessing to deploying the latest model) run automatically. How should I approach this problem?

1 Answer


You can create a custom pipeline using the Kubeflow Pipelines SDK (v1.8.9 or later) or TensorFlow Extended (v0.30.0 or later).

  • If you use TensorFlow in an ML workflow that processes terabytes of structured data or text data, we recommend building your pipeline with TFX.

  • For other use cases, we recommend building your pipeline with the Kubeflow Pipelines SDK, which lets you implement your workflow by building custom components or reusing pre-built components.

To create a Kubeflow pipeline, you can follow the official Vertex AI guide on building pipelines with the Kubeflow Pipelines SDK.
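Once the pipeline spec is compiled, it can be submitted to Vertex AI Pipelines. A hedged sketch using the `google-cloud-aiplatform` client follows; the project ID, region, staging bucket, and parameter values are placeholders, and this requires valid GCP credentials, so it is not runnable as-is:

```python
# Sketch: submit a compiled pipeline spec to Vertex AI Pipelines.
# PROJECT_ID, REGION, and the bucket/parameter values are illustrative placeholders.
from google.cloud import aiplatform

aiplatform.init(
    project="PROJECT_ID",
    location="us-central1",
    staging_bucket="gs://your-staging-bucket",  # placeholder bucket
)

job = aiplatform.PipelineJob(
    display_name="custom-preprocess-train",
    template_path="pipeline.json",  # the compiled spec from the previous step
    parameter_values={"raw_data_uri": "gs://your-bucket/raw-data"},
)

# Runs the pipeline on Vertex AI; use job.submit() for a non-blocking call.
job.run()
```

Scheduling this submission (for example with Cloud Scheduler or a Vertex AI pipeline schedule) is one way to have the pipeline pick up newly arrived data automatically.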