Unable to load the model after following the instructions.
Traceback (most recent call last):
  File "/home/user/oobabooga_linux/text-generation-webui/modules/ui_model_menu.py", line 179, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "/home/user/oobabooga_linux/text-generation-webui/modules/models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "/home/user/oobabooga_linux/text-generation-webui/modules/models.py", line 287, in AutoGPTQ_loader
    return modules.AutoGPTQ_loader.load_quantized(model_name)
  File "/home/user/oobabooga_linux/text-generation-webui/modules/AutoGPTQ_loader.py", line 56, in load_quantized
    model = AutoGPTQForCausalLM.from_quantized(path_to_model, **params)
  File "/home/user/oobabooga_linux/installer_files/env/lib/python3.10/site-packages/auto_gptq/modeling/auto.py", line 94, in from_quantized
    return quant_func(
  File "/home/user/oobabooga_linux/installer_files/env/lib/python3.10/site-packages/auto_gptq/modeling/_base.py", line 793, in from_quantized
    accelerate.utils.modeling.load_checkpoint_in_model(
  File "/home/user/oobabooga_linux/installer_files/env/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 1279, in load_checkpoint_in_model
    checkpoint = load_state_dict(checkpoint_file, device_map=device_map)
  File "/home/user/oobabooga_linux/installer_files/env/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 1088, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
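For context on the final error: a `.safetensors` file starts with an 8-byte little-endian length followed by a JSON header of exactly that many bytes, so `HeaderTooLarge` usually means those first bytes are not a real header at all — typically a corrupted/truncated download, or a Git LFS pointer text file fetched in place of the actual weights. A minimal sketch of a local sanity check (the function name and the 100 MB cutoff are my own illustrative choices, not part of safetensors or the webui):

```python
import json
import struct

def check_safetensors_header(path):
    """Rough check of whether a file looks like a valid .safetensors file."""
    with open(path, "rb") as f:
        head = f.read(8)
        if len(head) < 8:
            return "file too short: truncated download?"
        # First 8 bytes: little-endian u64 giving the JSON header size.
        (header_len,) = struct.unpack("<Q", head)
        if header_len > 100 * 1024 * 1024:  # arbitrary sanity cutoff (assumption)
            # ASCII text (e.g. a Git LFS pointer starting with "version ...")
            # decodes to an absurd length -- the classic HeaderTooLarge cause.
            return f"bogus header length {header_len}: likely corrupted or an LFS pointer"
        try:
            json.loads(f.read(header_len))
        except ValueError:
            return "header is not valid JSON: file is corrupted"
        return "header looks OK"
```

If this reports a bogus length, re-downloading the model files (or running `git lfs pull` in the model folder) is usually the fix.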