Error on model load

#5
by warhol-AC - opened

I am sure I am missing something obvious but I get the following error when trying to load in oobabooga:

Traceback (most recent call last):
  File "D:\new_ooba\oobabooga_windows\text-generation-webui\server.py", line 102, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "D:\new_ooba\oobabooga_windows\text-generation-webui\modules\models.py", line 217, in load_model
    model = LoaderClass.from_pretrained(checkpoint, **params)
  File "D:\new_ooba\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 471, in from_pretrained
    return model_class.from_pretrained(
  File "D:\new_ooba\oobabooga_windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2405, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\TehVenom_Pygmalion-7b-4bit-GPTQ-Safetensors.

Your ooba install is not looking for a 4-bit model. Make sure to read the documentation so you pass the proper launch arguments, and follow the file-naming scheme it expects, so that it starts looking for the right files.
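As a sketch of what this means in practice: older text-generation-webui releases selected the GPTQ (4-bit) loader via launch flags such as `--wbits`, `--groupsize`, and `--model_type`. The exact flag names and values below are assumptions based on those older versions and on this particular model; check your webui's README for the options your version actually supports.

```shell
# Hedged example: flag names are from older text-generation-webui releases
# and may differ in your version. Run from the text-generation-webui folder.
python server.py \
  --model TehVenom_Pygmalion-7b-4bit-GPTQ-Safetensors \
  --wbits 4 \
  --groupsize 128 \
  --model_type llama
```

With flags like these the webui loads the model through its GPTQ code path instead of asking transformers for a standard `pytorch_model.bin`, which is what triggers the `OSError` above.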
