---
license: apache-2.0
datasets:
  - kobkrit/rd-taxqa
  - iapp_wiki_qa_squad
  - Thaweewat/alpaca-cleaned-52k-th
  - Thaweewat/instruction-wild-52k-th
  - Thaweewat/databricks-dolly-15k-th
  - Thaweewat/hc3-24k-th
  - Thaweewat/gpteacher-20k-th
  - Thaweewat/onet-m6-social
  - Thaweewat/alpaca-finance-43k-th
language:
  - th
  - en
library_name: transformers
pipeline_tag: text-generation
tags:
  - openthaigpt
  - llama
---

# 🇹🇭 OpenThaiGPT 1.0.0-alpha

OpenThaiGPT Version 1.0.0-alpha is the first Thai implementation of a 7B-parameter LLaMA v2 Chat model, finetuned to follow Thai-translated instructions, and makes use of the Hugging Face LLaMA implementation.

**Quantized 4-bit GGML of OpenThaiGPT 1.0.0-alpha**

## Upgrades from OpenThaiGPT 0.1.0-beta

  • Uses Facebook's LLaMA v2 7B Chat as the base model, which is pretrained on over 2 trillion tokens.
  • Context length is upgraded from 2,048 tokens to 4,096 tokens.
  • Allows research and commercial use.

## Pretrained Model

## Support

## License

Source Code: Apache Software License 2.0.
Weights: research and commercial use.

## Code and Weights

  • Colab Demo: https://colab.research.google.com/drive/1kDQidCtY9lDpk49i7P3JjLAcJM04lawu?usp=sharing
  • Finetune Code: https://github.com/OpenThaiGPT/openthaigpt-finetune-010beta
  • Inference Code: https://github.com/OpenThaiGPT/openthaigpt
  • Weights (LoRA Adapter): https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat
  • Weights (Hugging Face Checkpoint): https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
  • Weights (GGML): https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ggml
  • Weights (Quantized 4-bit GGML): https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ggml-q4
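Since the base model is LLaMA v2 Chat, inputs generally follow its `[INST]` prompt template. Below is a minimal sketch of building such a prompt; the helper name and the example system message are illustrative assumptions, not an official OpenThaiGPT API — see the Inference Code repository above for the exact format the project uses.

```python
# Sketch of a LLaMA v2 Chat-style prompt builder (illustrative, not the
# official OpenThaiGPT inference code).

def build_llama2_chat_prompt(user_message: str, system_message: str = "") -> str:
    """Wrap a user message in the LLaMA v2 [INST] chat template."""
    if system_message:
        # Optional system prompt goes inside <<SYS>> ... <</SYS>> markers.
        system_block = f"<<SYS>>\n{system_message}\n<</SYS>>\n\n"
    else:
        system_block = ""
    return f"<s>[INST] {system_block}{user_message} [/INST]"

# Example usage (the system message here is a placeholder):
prompt = build_llama2_chat_prompt("สวัสดีครับ", "You are a helpful assistant.")
print(prompt)
```

The resulting string is what you would pass to the tokenizer before generation; the model's reply follows the closing `[/INST]` marker.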

## Sponsors

Pantip.com, ThaiSC

## Powered by

OpenThaiGPT Volunteers, Artificial Intelligence Entrepreneur Association of Thailand (AIEAT), and Artificial Intelligence Association of Thailand (AIAT)

## Authors

Disclaimer: Provided responses are not guaranteed.