Use a pipeline as a high-level helper:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe = pipeline("text-generation", model="google/gemma-2-2b-it")
pipe(messages)
```
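To read the model's reply, the pipeline call can be captured in a variable. A minimal sketch, assuming the usual chat-style output of the text-generation pipeline (a list with one dict whose `generated_text` holds the conversation, the last entry being the new assistant message); the `max_new_tokens` value is illustrative:

```python
# Sketch: capture the output and print the assistant's reply.
# Assumes chat-style input returns [{"generated_text": [ ...messages... ]}].
outputs = pipe(messages, max_new_tokens=64)
reply = outputs[0]["generated_text"][-1]["content"]
print(reply)
```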