Fix Incorrect Prompt Template defined in tokenizer_config.json

#3

The chat_template property in tokenizer_config.json currently contains the Llama-3 chat template rather than ChatML, which is the format this model was actually trained with. This template is picked up by various tools and inference applications, so it should reflect the template the model actually uses.
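For reference, a minimal sketch of what the corrected entry could look like, using the commonly published ChatML Jinja template (the exact template shipped in this PR may differ):

```json
{
  "chat_template": "{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
}
```

With this in place, `tokenizer.apply_chat_template(...)` in transformers (and tools that read tokenizer_config.json) would wrap each turn in `<|im_start|>role ... <|im_end|>` markers instead of the Llama-3 header tokens.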

Crystalcareai changed pull request status to merged
