
Tokenizer related issues / errors

#4
by dakerholdings - opened

The comments in 'generate_openelm.py' claim: "Args: ... tokenizer: Tokenizer instance. If model is set as a string path, the tokenizer will be loaded from the checkpoint." However, the code does not behave that way: the signature reads "tokenizer: Union[str, AutoTokenizer] 'meta-llama/Llama-2-7b-hf'", and using that default produces "...Access to model meta-llama/Llama-2-7b-hf is restricted and you are not in the authorized list..." errors. The script also won't accept the tokenizer as a command-line parameter, and if I explicitly change it in the code to "'apple/OpenELM-XXX'", I get an error:

ValueError: Unrecognized configuration class <class 'transformers_modules.apple.OpenELM-270M.0a49ab455190e17fcf02d2f21d15e58f6496c8d9.configuration_openelm.OpenELMConfig'> to build an AutoTokenizer.

See https://github.com/huggingface/transformers/issues/30493
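For anyone hitting the same error: the OpenELM checkpoints do not ship tokenizer files, so `AutoTokenizer` cannot build a tokenizer from `OpenELMConfig`; the models are meant to be paired with the Llama-2 tokenizer. A minimal sketch of a workaround, assuming you have been granted access to the gated meta-llama/Llama-2-7b-hf repo (any compatible Llama tokenizer you can access should work; the function name here is just for illustration):

```python
# Sketch: load the Llama-2 tokenizer explicitly instead of letting
# AutoTokenizer try to resolve one from the OpenELM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate_with_openelm(prompt: str, hf_token: str) -> str:
    # Gated repo: requires an access token for an account on the authorized list.
    tokenizer = AutoTokenizer.from_pretrained(
        "meta-llama/Llama-2-7b-hf", token=hf_token
    )
    # trust_remote_code=True is required because OpenELM ships custom model code.
    model = AutoModelForCausalLM.from_pretrained(
        "apple/OpenELM-270M", trust_remote_code=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=32)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Passing a tokenizer instance constructed this way (rather than a string path) also sidesteps the `Unrecognized configuration class` error, since no `AutoTokenizer` lookup against `OpenELMConfig` ever happens.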

