Model Card for llama2_7b_taiwan_btc_qlora
These PEFT (QLoRA) adapter weights are fine-tuned to predict the BTC closing price from daily market data.
Disclaimer: This model is an experiment in applying an LLM to a time-series forecasting problem. It is not investment advice, and its predictions should not be used as a basis for investment decisions.
Model Details
Training data source: BTC/USD provided by Binance.
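The exact preprocessing used for fine-tuning is not published in this repo; the following is a minimal sketch, assuming daily BTC/USD OHLCV rows from Binance are turned into instruction-style samples that mirror the prompt format shown in the Uses section below. The column names, the example target value, and the build_sample helper are illustrative assumptions, not artifacts of this repo.

import pandas as pd

# Hypothetical daily OHLCV rows (column names and the target value are illustrative only).
df = pd.DataFrame(
    {
        "open": [64437.18],
        "high": [64960.37],
        "low": [62953.90],
        "close": [64808.35],
        "volume": [808273.27],
        "next_close": [65000.00],  # the following day's close, used as the training target
    }
)

def build_sample(row):
    # Mirrors the Chinese prompt used at inference time (see the Uses section).
    prompt = (
        f"昨日開盤價為{row.open:.2f},最高價為{row.high:.2f},最低價為{row.low:.2f},"
        f"收盤價為{row.close:.2f},交易量為{row.volume:.2f}。請預測今日BTC的收盤價?"
    )
    return {"prompt": prompt, "completion": f"{row.next_close:.2f}"}

train_samples = [build_sample(row) for row in df.itertuples(index=False)]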
Model Description
This repo contains QLoRA adapter weights trained on top of DavidLanz/Llama2-tw-7B-v2.0.1-chat, a Traditional Chinese (Taiwan) chat model derived from Meta's Llama 2 7B-chat.
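If you only want to confirm which base model the adapter expects before downloading any weights, the adapter configuration can be inspected on its own with the standard PEFT API; this is a small sketch of that check.

from peft import PeftConfig

# Reads just the adapter configuration; no base-model weights are downloaded.
peft_config = PeftConfig.from_pretrained("DavidLanz/llama2_7b_taiwan_btc_qlora")
print(peft_config.base_model_name_or_path)  # DavidLanz/Llama2-tw-7B-v2.0.1-chat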
Uses
# Inference-time imports: the base model is loaded in 4-bit and the QLoRA adapter is applied on top.
import torch
from peft import PeftModel
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    pipeline,
)
# Place the whole model on GPU 0 and load the base weights in 4-bit NF4
# with float16 compute, matching a typical QLoRA inference setup.
device_map = {"": 0}
use_4bit = True
bnb_4bit_compute_dtype = "float16"
bnb_4bit_quant_type = "nf4"
use_nested_quant = False

compute_dtype = getattr(torch, bnb_4bit_compute_dtype)

bnb_config = BitsAndBytesConfig(
    load_in_4bit=use_4bit,
    bnb_4bit_quant_type=bnb_4bit_quant_type,
    bnb_4bit_compute_dtype=compute_dtype,
    bnb_4bit_use_double_quant=use_nested_quant,
)
base_model_path = "DavidLanz/Llama2-tw-7B-v2.0.1-chat"
adapter_path = "DavidLanz/llama2_7b_taiwan_btc_qlora"

# Load the 4-bit quantized base model, then attach the QLoRA adapter weights.
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_path,
    low_cpu_mem_usage=True,
    return_dict=True,
    quantization_config=bnb_config,
    torch_dtype=torch.float16,
    device_map=device_map,
)
model = PeftModel.from_pretrained(base_model, adapter_path)

tokenizer = AutoTokenizer.from_pretrained(base_model_path, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"
# The model is already quantized and placed on the GPU, so the pipeline only needs the model and tokenizer.
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
messages = [
    {
        "role": "system",
        # "You are a professional BTC cryptocurrency analyst; predict BTC's closing price."
        "content": "你是一位專業的BTC虛擬貨幣分析預測BTC的收盤價格。",
    },
    {
        "role": "user",
        # "Yesterday's open was 64437.18, high 64960.37, low 62953.90, close 64808.35,
        #  volume 808273.27. Please predict today's BTC closing price."
        "content": "昨日開盤價為64437.18,最高價為64960.37,最低價為62953.90,收盤價為64808.35,交易量為808273.27。請預測今日BTC的收盤價?",
    },
]

# Render the chat messages with the model's chat template and generate a prediction.
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
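The pipeline returns free-form text, so the numeric prediction still has to be pulled out of the generated string. The snippet below is a minimal sketch, assuming the model replies with a plain decimal number; the regular expression and the post-processing are illustrative and not part of this repo.

import re

# The prompt is echoed back at the start of `generated_text`, so strip it off first.
generated = outputs[0]["generated_text"][len(prompt):]

# Assumption: the reply contains the predicted close as a plain decimal number.
match = re.search(r"\d+(?:\.\d+)?", generated)
predicted_close = float(match.group()) if match else None
print(predicted_close)

For repeated inference without keeping the PEFT wrapper around, the adapter can also be merged into a full-precision copy of the base model with model.merge_and_unload() before building the pipeline.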
Framework versions
- PEFT 0.10.0
Model tree for DavidLanz/llama2_7b_taiwan_btc_qlora
- Base model: DavidLanz/Llama2-tw-7B-v2.0.1-chat