Can't access via InferenceClient
#1
by ucllovelab
I have a Pro subscription and can query meta-llama/Llama-2-7b-chat-hf directly using the InferenceClient, like so -
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="meta-llama/Llama-2-7b-chat-hf",
    token=API_TOKEN,  # my Hugging Face access token
)
output = client.text_generation("Can you please let us know more details about your ")
But I can't do the same with your finetuned model.
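The failing call follows the same pattern; a sketch (the finetuned repo id isn't named in this post, so the one below is just a placeholder):

from huggingface_hub import InferenceClient

client = InferenceClient(
    model="your-org/llama-2-7b-finetune",  # placeholder: stand-in for the actual finetuned repo id
    token=API_TOKEN,  # same token that works for the chat model above
)
output = client.text_generation("Can you please let us know more details about your ")

This fails with -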
raise BadRequestError(message, response=response) from e
huggingface_hub.utils._errors.BadRequestError: (Request ID: INIZGDhU1EFXqvABh0Bzu)
Bad request:
meta-llama/Llama-2-7b-hf is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
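For reference, I read the error's last line as suggesting either a CLI login or passing the token explicitly, i.e. roughly:

# Option 1: authenticate once from the shell
#   huggingface-cli login
# Option 2: authenticate from Python before creating the client
from huggingface_hub import login

login(token=API_TOKEN)  # same token used with InferenceClient above

But I'm already passing my token directly to InferenceClient, as shown above.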
Any suggestions?
Thanks!