Inference API not working
Hi, when trying to test the model using the Inference API, the following error is shown:
```
Can't load tokenizer using from_pretrained, please update its configuration: Can't load tokenizer for 'utter-project/mHuBERT-147'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'utter-project/mHuBERT-147' is the correct path to a directory containing all relevant files for a Wav2Vec2CTCTokenizer tokenizer.
```
Has anyone encountered this issue before? Any guidance on how to resolve this error would be greatly appreciated.
Thank you!
Hi. Thanks for your interest in the model.
This is a checkpoint of a self-supervised speech representation model, not a ready-to-use ASR system, so the repository does not ship a Wav2Vec2CTCTokenizer and the hosted Inference API cannot run it directly. You need to load the checkpoint and fine-tune it (for example with a CTC head on labeled data) before you can run ASR inference; a rough sketch is below.
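For reference, here is a minimal sketch (not an official recipe) of what "load and fine-tune" looks like with the standard transformers HuBERT classes, assuming the repo exposes the usual config and feature-extractor files; the `vocab_size` is a placeholder you would derive from your own tokenizer:

```python
# Minimal sketch: use mHuBERT-147 as a speech encoder, then attach a CTC head
# for ASR fine-tuning. Values like vocab_size are placeholders, not official settings.
import torch
from transformers import AutoFeatureExtractor, HubertModel, HubertForCTC

model_id = "utter-project/mHuBERT-147"

# 1) Extract speech representations with the pretrained encoder.
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
encoder = HubertModel.from_pretrained(model_id)

audio = torch.randn(16000).numpy()  # one second of dummy 16 kHz audio
inputs = feature_extractor(audio, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    hidden_states = encoder(**inputs).last_hidden_state  # (batch, frames, hidden_size)

# 2) For ASR, wrap the same checkpoint with a (randomly initialized) CTC head
#    and fine-tune it on labeled audio/transcript pairs, e.g. with Trainer.
asr_model = HubertForCTC.from_pretrained(
    model_id,
    vocab_size=32,               # placeholder: size of your CTC vocabulary
    ctc_loss_reduction="mean",
)
```

The CTC head is randomly initialized, so the model only becomes usable for transcription after fine-tuning on labeled data.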
Best,