
# Mists-7B-v01-not-trained

Mists (Mistral Time Series) is a multimodal model that combines a language model with a time series model.
This model is based on the following models:

This is an experimental model. Since the adapter has not been trained, the model is not yet suitable for use.
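
The snippet below is a minimal, hypothetical sketch of how an adapter can bridge a time series encoder and a language model in this kind of multimodal setup. It is not the Mists implementation; the module names and hidden sizes are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class TimeSeriesProjector(nn.Module):
    """Illustrative adapter: projects time series encoder features
    into a language model's embedding space (dimensions are assumptions)."""
    def __init__(self, ts_hidden_size=512, lm_hidden_size=4096):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(ts_hidden_size, lm_hidden_size),
            nn.GELU(),
            nn.Linear(lm_hidden_size, lm_hidden_size),
        )

    def forward(self, ts_features):
        # ts_features: (batch, num_patches, ts_hidden_size)
        return self.proj(ts_features)  # (batch, num_patches, lm_hidden_size)

# Dummy forward pass to show the shapes involved.
projector = TimeSeriesProjector()
dummy_features = torch.randn(1, 64, 512)
print(projector(dummy_features).shape)  # torch.Size([1, 64, 4096])
```

In models of this style, the projected features replace the placeholder token (here, `<time_series>`) in the prompt before the language model processes the sequence; training the projector is the step that has not yet been done for this checkpoint.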

## How to use

```python
!pip install accelerate
```

```python
from transformers import AutoProcessor, AutoModel
import torch

model_id = "HachiML/Mists-7B-v01-not-trained"

# The repository ships custom modeling/processing code, so trust_remote_code is required.
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_id,
    torch_dtype=torch.float32,
    low_cpu_mem_usage=True,
    device_map="auto",
    trust_remote_code=True,
)
```
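
The checkpoint is stored in float32 and has roughly 7.62B parameters. As a quick sanity check after loading, you can count parameters and confirm the dtype; this uses only standard transformers/PyTorch attributes, not Mists-specific APIs.

```python
# Optional sanity check after loading.
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e9:.2f}B parameters")  # ~7.62B for this checkpoint
print(model.dtype)                            # torch.float32, matching torch_dtype above
```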
```python
import pandas as pd
import torch

# Load a price history CSV and keep the first 512 rows of OHLCV columns.
hist_ndaq = pd.read_csv("nasdaq_price_history.csv")
time_series_data = hist_ndaq[["Open", "High", "Low", "Close", "Volume"]].iloc[:512]

# The <time_series> token marks where the time series features are inserted into the prompt.
prompt = "USER: <time_series>\nWhat are the features of this data?\nASSISTANT:"
inputs = processor(prompt, time_series_data, return_tensors="pt")

# Move all processor outputs to the same device as the model.
device = "cuda" if torch.cuda.is_available() else "cpu"
inputs = {key: value.to(device) for key, value in inputs.items()}

output = model.generate(**inputs, max_new_tokens=200, do_sample=False)
print(processor.decode(output[0], skip_special_tokens=False))
```
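
The example above assumes a local `nasdaq_price_history.csv`. If you do not have such a file, a synthetic OHLCV frame of the same shape can stand in; the values below are random and purely illustrative.

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the CSV: 512 rows of random OHLCV values.
rng = np.random.default_rng(0)
prices = rng.normal(loc=100.0, scale=5.0, size=(512, 4))
volume = rng.integers(1_000, 10_000, size=(512, 1)).astype(float)
time_series_data = pd.DataFrame(
    np.hstack([prices, volume]),
    columns=["Open", "High", "Low", "Close", "Volume"],
)
```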