Pipeline not working #3
opened by lucabot
Hi, when I try to use the model through the pipeline, I get this error:
ValueError: Unrecognized configuration class <class 'transformers_modules.nikravan.glm-4vq.e441477369dc88ad0ab225d9cd69db0291e2dc7b.configuration_chatglm.ChatGLMConfig'> for this kind of AutoModel: AutoModelForDocumentQuestionAnswering.
Model type should be one of LayoutLMConfig, LayoutLMv2Config, LayoutLMv3Config.
Any idea what it could be?
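For context, this is roughly how I'm creating the pipeline (the task string and arguments here are from memory, so treat it as a sketch rather than my exact script):

```python
from transformers import pipeline

# Roughly what I'm running; the task name and arguments are approximate.
pipe = pipeline(
    "document-question-answering",
    model="nikravan/glm-4vq",
    trust_remote_code=True,
)
```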
Also, is there any way to run this locally, loading the model directly onto an 8 GB VRAM GPU? I tried llm_int8_enable_fp32_cpu_offload=True, but it throws:
ValueError: Blockwise quantization only supports 16/32-bit floats, but got torch.uint8
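For reference, this is roughly how I'm trying to load it with CPU offload enabled (the exact quantization settings are approximate, so this is only a sketch of my attempt):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Approximate loading code; the quantization settings may not match
# exactly what I last ran.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
    llm_int8_enable_fp32_cpu_offload=True,
)
model = AutoModelForCausalLM.from_pretrained(
    "nikravan/glm-4vq",
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)
```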
Thanks in advance.