45 MB model too big for Google AI Platform


I'm trying to use AI Platform to deploy a scikit-learn pipeline. The model.joblib file I'm trying to deploy is 45 megabytes.

  • python version: 3.7
  • framework: scikit-learn (==0.20.4)
  • Single Core CPU, Quad Core CPU (Beta)
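For context, the 45 MB figure is the on-disk size of model.joblib; the in-memory footprint after unpickling can be considerably larger, and that is what the service limit applies to. A minimal local sanity check (a sketch — `model.joblib` here is assumed to be the artifact uploaded to the bucket):

```python
import os
import pickle


def pickled_mb(obj) -> float:
    """Approximate serialized size of a Python object, in megabytes."""
    return len(pickle.dumps(obj)) / 1e6


path = "model.joblib"
if os.path.exists(path):
    import joblib  # same library used to serialize the pipeline

    print(f"on-disk: {os.path.getsize(path) / 1e6:.1f} MB")
    # Crude proxy for the unpickled footprint; the serving instance also
    # pays for every imported library (torch, transformers, farm, ...),
    # not just the model object itself.
    print(f"pickled: {pickled_mb(joblib.load(path)):.1f} MB")
```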

I've used the following command to deploy (and have also tried the Cloud Console GUI):

 gcloud beta ai-platform versions create v0 \
  --model test_watch_model \
  --origin gs://rohan_test_watch_model \
  --runtime-version=1.15 \
  --python-version=3.7 \
  --package-uris=gs://rohan_test_watch_model/train_custom-0.1.tar.gz \
  --framework=scikit-learn \
  --project=xxxx

This is the setup.py file I'm using, in case the problem might lie with the libraries.

from setuptools import setup

setup(
    name='train_custom',
    version='0.1',
    scripts=[
        # 'train_custom.py',
        # 'data_silo_custom.py',
        # 'dataset_custom.py',
        # 'preprocessor_custom.py'
        'all.py'
    ],
    install_requires=[
        "torch==1.5.1",
        "transformers==3.0.2",
        "farm==0.4.6"
    ]
)
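One thing worth noting: install_requires pulls torch, transformers, and farm into the serving instance, and those imports count against the same memory budget as the model (an installed torch alone runs to several hundred megabytes). If the pickled pipeline can be loaded with scikit-learn alone, a trimmed setup.py would look like the sketch below — this is an assumption on my part and only works if all.py and the pipeline's custom steps do not import torch/transformers/farm at unpickling time:

```python
from setuptools import setup

# Sketch: keep only the custom serving code, drop the heavy ML frameworks.
# Valid only if the pickled pipeline can be deserialized without
# torch/transformers/farm being importable.
setup(
    name='train_custom',
    version='0.1',
    scripts=['all.py'],
    install_requires=[],  # scikit-learn 0.20.4 comes from --framework/--runtime-version
)
```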

I also tried removing torch from setup.py and instead using the pre-built torch 1.3 package from the cloud-ai-pytorch bucket (see http://storage.googleapis.com/cloud-ai-pytorch/readme.txt), but that leaves me with the same error message:

ERROR: (gcloud.beta.ai-platform.versions.create) Create Version failed. Bad model detected with error: Model requires more memory than allowed. Please try to decrease the model size and re-deploy. If you continue to experience errors, please contact support.