How to run this without LM Studio?
#2 opened by Helmet331
Can't get it to work.
My Python code:
from transformers import AutoTokenizer, AutoModelForCausalLM
model_id = "bartowski/c4ai-command-r-08-2024-GGUF"
filename = "c4ai-command-r-08-2024-Q6_K_L.gguf"
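# Load the tokenizer and model weights straight from the GGUF file on the Hub
# (needs a recent transformers plus the gguf package; transformers dequantizes
# the GGUF on load, so this takes far more RAM than llama.cpp or LM Studio)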
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=filename)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=filename)
# Format message with the command-r-08-2024 chat template
messages = [{"role": "user", "content": "Hello?"}]
input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
# Rendered prompt: <|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hello?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
gen_tokens = model.generate(
    input_ids,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.3,
)
gen_text = tokenizer.decode(gen_tokens[0])
print(gen_text)
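Or is llama-cpp-python the better way to run a GGUF outside LM Studio? This is only a sketch from my reading of the llama-cpp-python docs (same repo_id and filename as above; n_ctx and the other arguments are guesses and I haven't verified it):

from llama_cpp import Llama

# Download the GGUF from the Hub and load it with the llama.cpp bindings
llm = Llama.from_pretrained(
    repo_id="bartowski/c4ai-command-r-08-2024-GGUF",
    filename="c4ai-command-r-08-2024-Q6_K_L.gguf",
    n_ctx=4096,  # assumed context size, adjust to available RAM
)

# create_chat_completion() uses the chat template stored in the GGUF metadata
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello?"}],
    max_tokens=100,
    temperature=0.3,
)
print(response["choices"][0]["message"]["content"])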