Update chat template and eos_token configs
#9
by ykhwang · opened
No description provided.
References
- EOS token fixes
- Updated chat template
ykhwang changed pull request status to open
This PR fixes issues with special token handling when using the chat completion endpoints. The same change should be applied to the other instruct models.
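For context, fixes of this kind usually amount to aligning the stop tokens declared in the model's config files with what the instruct model actually emits: Llama 3 instruct models end each turn with `<|eot_id|>` rather than `<|end_of_text|>`, so if only the latter is registered as EOS, chat completions never stop. A sketch of what the corrected `generation_config.json` fragment looks like for a Llama 3.1 Instruct model (the token ids below are illustrative of the upstream Meta-Llama-3.1 configs, not a copy of this PR's diff):

```json
{
  "bos_token_id": 128000,
  "eos_token_id": [128001, 128008, 128009]
}
```

Here 128001 is `<|end_of_text|>`, 128008 is `<|eom_id|>`, and 128009 is `<|eot_id|>`; listing all three lets generation stop at any of them. The chat template in `tokenizer_config.json` should emit the same end-of-turn token it declares here.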
Thanks for the fixes @ykhwang , would you mind applying those to the rest of the affected quants? Otherwise we can do that on our end, so whatever is more suitable for you! Thanks again 🤗
alvarobartt changed pull request status to merged
Hello @alvarobartt , sorry for the late reply. I found that the additional chat template updates for tool usage (https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct/commit/cfe126a16d4108374a6f9cda6d117c0d08b99e23) seem to have already been applied by the latest commit. Thanks a lot!