In Databricks, I have used MLflow and served my model through a REST API. It works fine when all model features are provided. But in my use case, only a single feature (the primary key) will be provided by the consumer application, and my code has to look up the other features from a database based on that key and then call model.predict to return the prediction. From my research I understood that the REST endpoint simply invokes the model.predict function. How can I make it invoke a data massaging function before predicting?
How do I invoke a data enrichment function before model.predict while serving the model in Databricks
There are two approaches for that:

1. Use a custom MLflow model (a pyfunc wrapper) in which you override the predict function: it first looks up the additional features from the database (or any other source) using the key, and then calls the actual predict of the underlying model. You can find more information in the following answers: 1, 2. A minimal sketch follows this list.

2. Use Databricks Feature Store for your data: train and log the model with the FeatureStoreClient.log_model function, publish the feature store tables to an online store, and then serve the model via Model Serving; the endpoint will automatically look up the features for you. See the second sketch below.
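Here is a minimal sketch of the first approach, assuming a model already logged in an earlier run and a primary key column named customer_id; the table name, connection string, and column names are placeholders you would replace with your own lookup logic.

```python
import mlflow
import mlflow.pyfunc
import pandas as pd
import sqlalchemy


class EnrichedModel(mlflow.pyfunc.PythonModel):
    """Wraps the trained model and enriches the incoming key with features before predicting."""

    def load_context(self, context):
        # The original trained model is attached as an artifact of this wrapper.
        self.model = mlflow.pyfunc.load_model(context.artifacts["base_model"])

    def _lookup_features(self, keys: pd.Series) -> pd.DataFrame:
        # Hypothetical lookup: replace with a query against your feature database
        # (JDBC/ODBC, REST, or an online store reachable from the serving container).
        engine = sqlalchemy.create_engine("postgresql://user:pass@host/featuredb")  # placeholder DSN
        query = "SELECT * FROM customer_features WHERE customer_id IN %(keys)s"
        return pd.read_sql(query, engine, params={"keys": tuple(keys)})

    def predict(self, context, model_input: pd.DataFrame):
        # The consumer application sends only the primary key column.
        features = self._lookup_features(model_input["customer_id"])
        return self.model.predict(features.drop(columns=["customer_id"]))


# Log the wrapper instead of the bare model; the serving endpoint will then
# call EnrichedModel.predict(), which does the enrichment before predicting.
with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="enriched_model",
        python_model=EnrichedModel(),
        artifacts={"base_model": "runs:/<existing_run_id>/model"},
    )
```

You then create the serving endpoint from this wrapper model rather than the original one, and the consumer keeps sending only the key.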
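And a sketch of the second approach, assuming labels_df is a Spark DataFrame holding the primary key (customer_id) and the label, and that a feature table feature_db.customer_features keyed on customer_id already exists; the table, column, and model names are illustrative.

```python
from databricks.feature_store import FeatureStoreClient, FeatureLookup
from sklearn.linear_model import LogisticRegression
import mlflow

fs = FeatureStoreClient()

# Declare which feature table to join on the primary key the caller will send.
feature_lookups = [
    FeatureLookup(table_name="feature_db.customer_features", lookup_key="customer_id"),
]

# The training set records the lookup metadata alongside the training data.
training_set = fs.create_training_set(
    df=labels_df,
    feature_lookups=feature_lookups,
    label="churned",
    exclude_columns=["customer_id"],
)
training_pdf = training_set.load_df().toPandas()

model = LogisticRegression().fit(
    training_pdf.drop(columns=["churned"]), training_pdf["churned"]
)

# Logging via the Feature Store client packages the lookup metadata with the model,
# so Model Serving can fetch the features automatically at request time once the
# feature table has been published to an online store (fs.publish_table).
fs.log_model(
    model,
    artifact_path="model",
    flavor=mlflow.sklearn,
    training_set=training_set,
    registered_model_name="churn_model",
)
```

With this option the enrichment is handled by the platform: the consumer sends just customer_id, and the endpoint joins the remaining features from the published online table before scoring.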