Taiwan-LLM-7B-v2.0.1-chat-4bits-GPTQ
- Model creator: Yen-Ting Lin
- Original model: Taiwan LLM based on LLaMa2-7b v2.0.1 chat
Description
This repo contains GPTQ model files for Yen-Ting Lin's Taiwan LLM based on LLaMa2-7b v2.0.1 chat.
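A minimal sketch of loading these 4-bit GPTQ weights with 🤗 Transformers (which dispatches GPTQ checkpoints via the `auto-gptq`/`optimum` integration). The repo id below is an assumption based on the model name, not confirmed by this card; weights are downloaded on first call.

```python
# Minimal sketch: load the 4-bit GPTQ checkpoint and generate a reply.
# Assumes `transformers` and `auto-gptq` (or `optimum`) are installed,
# and that the Hugging Face repo id below is correct (it is a guess).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "yentinglin/Taiwan-LLM-7B-v2.0.1-chat-4bits-GPTQ"  # assumed repo id

def chat(prompt: str, max_new_tokens: int = 128) -> str:
    """Tokenize a prompt, run generation on the quantized model, decode the result."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Calling `chat("你好,請自我介紹。")` should return the model's Traditional Chinese reply; on a single consumer GPU the 4-bit weights fit in roughly 5 GB of VRAM.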
Original model card: Yen-Ting Lin's Taiwan LLM based on LLaMa2-7b
Taiwan LLM based on LLaMa2-7b
Continued pretraining on 20 billion tokens of Traditional Chinese text, followed by instruction fine-tuning on millions of conversations.
This version does NOT include Common Crawl data.
🌟 Check out the new Taiwan-LLM Demo Chat-UI 🌟
Collaboration with Ubitus K.K. 💪💪💪
Taiwan LLM v2 is conducted in collaboration with Ubitus K.K., which provides valuable technical support and compute resources for the project.