# gemma2b-summarize-gemini1.5flash
This model is a fine-tuned version of google/gemma-2b on the llama-duo/synth_summarize_dataset_dedup dataset. It reaches a final validation loss of 2.5133 on the evaluation set (see the training results table below).
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.7544        | 0.9630 | 13   | 2.8722          |
| 1.7723        | 2.0    | 27   | 2.6064          |
| 1.4023        | 2.9630 | 40   | 2.5710          |
| 1.2778        | 4.0    | 54   | 2.5349          |
| 1.1848        | 4.9630 | 67   | 2.5176          |
| 1.1522        | 6.0    | 81   | 2.5045          |
| 1.1305        | 6.9630 | 94   | 2.5065          |
| 1.1075        | 8.0    | 108  | 2.5136          |
| 1.1049        | 8.9630 | 121  | 2.5129          |
| 1.1048        | 9.6296 | 130  | 2.5133          |
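The validation loss in the table bottoms out around epoch 6 and then drifts slightly upward, a common sign of mild overfitting. As a quick sanity check (using only the numbers from the table above, no new data), the best checkpoint can be picked programmatically:

```python
# (epoch, step, validation_loss) rows copied from the training-results table.
history = [
    (0.9630, 13, 2.8722),
    (2.0, 27, 2.6064),
    (2.9630, 40, 2.5710),
    (4.0, 54, 2.5349),
    (4.9630, 67, 2.5176),
    (6.0, 81, 2.5045),
    (6.9630, 94, 2.5065),
    (8.0, 108, 2.5136),
    (8.9630, 121, 2.5129),
    (9.6296, 130, 2.5133),
]

# Select the row with the lowest validation loss.
best_epoch, best_step, best_loss = min(history, key=lambda row: row[2])
print(f"best checkpoint: epoch {best_epoch}, step {best_step}, "
      f"val loss {best_loss}")
```

This identifies epoch 6.0 (step 81, validation loss 2.5045) as the strongest checkpoint, suggesting that training much beyond six epochs adds little for this run.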
Base model: google/gemma-2b