"No corresponding model for provided filename, make sure to specify a valid model_type" error in gpt4all


I am trying to run a gpt4all model through the Python gpt4all library and host it online. According to the documentation, my call is formatted correctly: I have specified the model name and model_path, and I have downloaded the actual model to my machine.

My code:

from gpt4all import GPT4All

model = GPT4All("orca-mini-3b.ggmlv3.q4_0.bin",model_path="C:/Users/mhaba/AppData/Local/nomic.ai/GPT4All/", allow_download=False)

This is the error I keep getting:

PS C:\Users\mhaba\Downloads\deliver> & "C:/Program Files (x86)/Microsoft Visual Studio/Shared/Python39_64/python.exe" c:/Users/mhaba/Downloads/deliver/app/test.py
Traceback (most recent call last):
  File "c:\Users\mhaba\Downloads\deliver\app\test.py", line 3, in <module>
    model = GPT4All("orca-mini-3b.ggmlv3.q4_0",model_path="C:/Users/mhaba/AppData/Local/nomic.ai/GPT4All/", allow_download=False)
  File "C:\Users\mhaba\AppData\Roaming\Python\Python39\site-packages\gpt4all\gpt4all.py", line 45, in __init__
    self.model = GPT4All.get_model_from_name(model_name)
  File "C:\Users\mhaba\AppData\Roaming\Python\Python39\site-packages\gpt4all\gpt4all.py", line 319, in get_model_from_name
    raise ValueError(err_msg)
ValueError: No corresponding model for provided filename orca-mini-3b.ggmlv3.q4_0.bin.
            If this is a custom model, make sure to specify a valid model_type.

I am not sure what’s causing the error.

1 Answer

It seems the model is not present in the path you provided, model_path="C:/Users/mhaba/AppData/Local/nomic.ai/GPT4All/", so when GPT4All tries to load the model it finds nothing in that directory. You can either set allow_download=True, in which case the GPT4All constructor will download the model into that path, or point model_path to a directory where you have already downloaded the model. Here's how you can do it:

from gpt4all import GPT4All
path = "where you want your model to be downloaded"
model = GPT4All("orca-mini-3b.ggmlv3.q4_0.bin", model_path=path, allow_download=True)

Once you have downloaded the model, set allow_download=False on subsequent runs. In your current code, the method can't find any previously downloaded model at that path. You can also pass os.getcwd() as model_path to download the model into your current working directory.
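Before loading, it can help to verify that the model file actually exists where model_path points; a mismatch between the filename and the file on disk is a common cause of this error. A minimal sketch (the paths below are the ones from the question; adjust them to your own setup):

```python
import os

# Paths taken from the question -- replace with your own.
model_name = "orca-mini-3b.ggmlv3.q4_0.bin"
model_path = "C:/Users/mhaba/AppData/Local/nomic.ai/GPT4All/"

full_path = os.path.join(model_path, model_name)
if os.path.isfile(full_path):
    print(f"Model file found at {full_path}")
else:
    # If this prints, either fix model_path or use allow_download=True
    # so gpt4all downloads the model into that directory.
    print(f"Model file missing at {full_path}")
```

If the file is missing, run once with allow_download=True and then switch back to allow_download=False once the download has completed.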