OSError: roborovksi/superprompt-v1 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'

#1
by Softology - opened

When I run the example code

from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-small")
model = T5ForConditionalGeneration.from_pretrained("roborovksi/superprompt-v1", device_map="auto")

input_text = "Expand the following prompt to add more detail: A storefront with 'Text to Image' written on it."
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")

outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))

I get this error

Traceback (most recent call last):
  File "D:\Tests\SuperPrompt\superprompt.py", line 4, in <module>
    model = T5ForConditionalGeneration.from_pretrained("roborovksi/superprompt-v1", device_map="auto")
  File "D:\Tests\SuperPrompt\voc_superprompt\lib\site-packages\transformers\modeling_utils.py", line 2874, in from_pretrained
    resolved_config_file = cached_file(
  File "D:\Tests\SuperPrompt\voc_superprompt\lib\site-packages\transformers\utils\hub.py", line 421, in cached_file
    raise EnvironmentError(
OSError: roborovksi/superprompt-v1 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`

Does it need the HF token set? It is accessible without logging in, and even setting HF_TOKEN to my token gives the same error.
Any ideas?

There's a typo in the Hugging Face repo name in the example. Replace it with the correct one:

roborovski/superprompt-v1
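
i.e. the load call becomes (a minimal sketch; only the repo id changes, everything else in the snippet above stays the same):

model = T5ForConditionalGeneration.from_pretrained("roborovski/superprompt-v1", device_map="auto")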

Thanks, that worked. I should have seen that.

It seems max_new_tokens is a parameter for generate, and

output = model.generate(tokenizer(input_text, return_tensors="pt").input_ids.to("mps"), max_new_tokens=50)

seems to do the trick (note that 50 is just a number I picked; other values may be better).
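
For anyone else hitting both issues, here is a minimal end-to-end sketch combining the corrected repo id with max_new_tokens. The device selection is just an assumption; use "cuda", "mps", or "cpu" as appropriate for your machine.

import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Pick whichever device is available; "mps" also works on Apple silicon.
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-small")
model = T5ForConditionalGeneration.from_pretrained("roborovski/superprompt-v1").to(device)

input_text = "Expand the following prompt to add more detail: A storefront with 'Text to Image' written on it."
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to(device)

# max_new_tokens caps the generated length; 50 is only an illustrative value.
outputs = model.generate(input_ids, max_new_tokens=50)
print(tokenizer.decode(outputs[0]))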

These have been fixed, thanks for raising it.

roborovski changed discussion status to closed
