Failed to create LLM 'stablelm' GGUF ctransformers Google Colab

531 views

I often get an error from ctransformers when importing some GGUF models with model_type='stablelm' in Google Colab. Here is a code example:

from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Marx-3B-v3-GGUF",
    model_file="marx-3b-v3.Q3_K_S.gguf",
    model_type="stablelm",
)
print(llm("AI is going to"))

It actually downloaded the .gguf file from the repo, but then raised this error:

RuntimeError: Failed to create LLM 'stablelm' from '/root/.cache/huggingface/hub/models--TheBloke--Marx-3B-v3-GGUF/blobs/fb16032da1b4f68d465cb7b2164c5305be8a008657ed0cd6cbb91b3a94b032ee'.
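In ctransformers, this error typically surfaces when the installed build does not recognize the requested model_type (architecture support varies by version). A minimal pre-flight check can fail early with a clearer message; note that KNOWN_MODEL_TYPES below is an assumption for illustration, so copy the real list from the README of the ctransformers version you actually have installed:

```python
# Hypothetical pre-flight check, run before from_pretrained.
# KNOWN_MODEL_TYPES is an assumption: take the authoritative list
# from the README of your installed ctransformers version.
KNOWN_MODEL_TYPES = {
    "gpt2", "gptj", "gpt_neox", "falcon", "llama",
    "mpt", "starcoder", "dolly-v2", "replit",
}

def check_model_type(model_type: str) -> str:
    """Raise early with a clear message instead of a late RuntimeError."""
    if model_type not in KNOWN_MODEL_TYPES:
        raise ValueError(
            f"model_type {model_type!r} is not in the supported set; "
            "upgrade ctransformers or pick a supported architecture"
        )
    return model_type

check_model_type("llama")  # passes silently
```

If `check_model_type("stablelm")` raises here, the 'Failed to create LLM' error is likely a version/architecture mismatch rather than a problem with the downloaded file.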

There is 1 answer below

Answer by thangaraj1980:
from ctransformers import AutoModelForCausalLM, AutoConfig, Config

Hi, I tried the options below with Mistral, but I am getting this error:

RuntimeError: Failed to create LLM 'mistral' from 'C:\LLM\Mistral\mistral-7b-instruct-v0.2.Q4_K_M.gguf'. Error.

Any help?

    conf = AutoConfig(Config(temperature=0.25,
                             max_new_tokens=512))

    # Attempt 1: local path as the first argument, repo id as model_file
    model = AutoModelForCausalLM.from_pretrained(
        r"C:\LLM\Mistral\mistral-7b-instruct-v0.2.Q4_K_M.gguf",
        model_file="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
        config=conf,
        device_map="cpu",
        model_type="mistral")

    # Attempt 2: repo id first, local path as model_file
    model = AutoModelForCausalLM.from_pretrained(
        "TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
        model_file=r"C:\LLM\Mistral\mistral-7b-instruct-v0.2.Q4_K_M.gguf",
        config=conf,
        device_map="cpu",
        model_type="mistral")

    # Attempt 3: local path only, no model_file
    model = AutoModelForCausalLM.from_pretrained(
        r"C:\LLM\Mistral\mistral-7b-instruct-v0.2.Q4_K_M.gguf",
        config=conf,
        device_map="cpu",
        model_type="mistral")

    # Attempt 4: explicit keyword argument plus local_files_only
    model = AutoModelForCausalLM.from_pretrained(
        model_path_or_repo_id=r"C:\LLM\Mistral\mistral-7b-instruct-v0.2.Q4_K_M.gguf",
        model_type="mistral",
        config=conf,
        device_map="cpu",
        local_files_only=True)