# Fine-tuning Gemma-2b-it to answer European travel-related Q&A
This is a sample project in which we fine-tuned the Gemma-2b-it model on Wikivoyage travel data for European cities to see whether it answers travel-related Q&A better than the base model. It is an ongoing work in progress.
## Training Specs
- Gemma-2b-it fine-tuned on 2 NVIDIA A40 GPUs using the Hugging Face TRL `SFTTrainer` (a minimal training sketch follows this list)
- 10 epochs
- Learning rate of 1e-5
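The exact training script is not included in this card. The sketch below shows how a run with these specs could be set up with `trl` and `transformers` (argument placement assumes a `trl` release around the 0.8 series; newer releases move `dataset_text_field` and `max_seq_length` into `SFTConfig`). The dataset file name, text field, batch size, gradient accumulation, and sequence length are assumptions, not values from this card.

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

model_id = "google/gemma-2b-it"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Hypothetical dataset file: one record per Q&A pair with a "text" field
# already formatted with the Gemma chat template.
dataset = load_dataset("json", data_files="wikivoyage_europe_qa.jsonl", split="train")

args = TrainingArguments(
    output_dir="gemma-2b-it-euro-travel",
    num_train_epochs=10,             # from the specs above
    learning_rate=1e-5,              # from the specs above
    per_device_train_batch_size=2,   # assumed; not stated in the card
    gradient_accumulation_steps=4,   # assumed; not stated in the card
    bf16=True,
    logging_steps=50,
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    args=args,
    train_dataset=dataset,
    dataset_text_field="text",       # assumed field name
    max_seq_length=1024,             # assumed maximum example length
)
trainer.train()
```

For the 2-GPU setup described above, a script like this would typically be launched with `torchrun --nproc_per_node 2 train.py` or `accelerate launch train.py` so the `Trainer` runs data-parallel across both A40s.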
## Data
- Wikivoyage (English) data for 160 European cities.
- Used Gemini 1.5 Pro and Gemini 1.5 Flash to extract relevant questions from the data (a sketch of this step follows the list).
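The extraction prompts themselves are not part of this card. Below is a minimal sketch, using the `google-generativeai` SDK, of how questions could be extracted from a Wikivoyage article; the prompt wording and the `extract_questions` helper are illustrative, not the exact pipeline used here.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")

# Gemini 1.5 Pro was one of the two models used; swap in "gemini-1.5-flash" for the other.
model = genai.GenerativeModel("gemini-1.5-pro")

def extract_questions(city: str, article_text: str) -> str:
    """Ask Gemini to turn a Wikivoyage article into travel Q&A pairs."""
    prompt = (
        "You are preparing a travel Q&A dataset.\n"
        f"From the following Wikivoyage article about {city}, write "
        "question-answer pairs a traveller might ask, using only facts "
        "stated in the article.\n\n"
        f"{article_text}"
    )
    response = model.generate_content(prompt)
    return response.text
```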