# 3_loa
This model is a fine-tuned version of google/flan-t5-large on the billsum dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 1.4825
- Rouge1: 0.201
- Rouge2: 0.1132
- Rougel: 0.1753
- Rougelsum: 0.1755
- Gen Len: 19.0
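To try the checkpoint, it can be loaded with the transformers library. Below is a minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub as eschorn/3_loa and that inputs use the usual T5-style "summarize: " prefix; the card does not include the preprocessing code, so the prefix is an assumption.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "eschorn/3_loa"  # assumption: Hub repository name for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

bill_text = "The people of the State of California do enact as follows: ..."
inputs = tokenizer(
    "summarize: " + bill_text,  # T5-style task prefix (assumed preprocessing)
    max_length=1024,
    truncation=True,
    return_tensors="pt",
)
summary_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Note that the constant Gen Len of 19.0 in the results suggests evaluation generated very short summaries (likely a default generation cap of 20 tokens), so max_new_tokens is raised here for more useful output.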
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
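The card states only that the billsum dataset was used. A minimal loading sketch with the datasets library follows (split and column names are those of the public billsum dataset on the Hub):

```python
from datasets import load_dataset

billsum = load_dataset("billsum")  # splits: train, test, ca_test
example = billsum["train"][0]
print(example["title"])            # bill title
print(example["text"][:300])       # full bill text (model input)
print(example["summary"][:300])    # reference summary (training target)
```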
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
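The training script itself is not included in the card. As a sketch, these hyperparameters map onto transformers' Seq2SeqTrainingArguments roughly as follows; the output directory and the per-epoch evaluation strategy are assumptions inferred from the results table below.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="3_loa",           # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,               # Adam betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: one eval per epoch, matching the table
    predict_with_generate=True,   # generate summaries during eval so ROUGE can be computed
)
```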
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 2.1079        | 1.0   | 989   | 1.6673          | 0.2028 | 0.1092 | 0.1748 | 0.1751    | 19.0    |
| 1.8481        | 2.0   | 1978  | 1.6150          | 0.1979 | 0.1061 | 0.1715 | 0.1717    | 19.0    |
| 1.7889        | 3.0   | 2967  | 1.5833          | 0.1994 | 0.1100 | 0.1727 | 0.1727    | 19.0    |
| 1.7319        | 4.0   | 3956  | 1.5584          | 0.1978 | 0.1084 | 0.1718 | 0.1718    | 19.0    |
| 1.7279        | 5.0   | 4945  | 1.5440          | 0.2016 | 0.1106 | 0.1755 | 0.1756    | 19.0    |
| 1.7386        | 6.0   | 5934  | 1.5326          | 0.1991 | 0.1086 | 0.1734 | 0.1736    | 19.0    |
| 1.6972        | 7.0   | 6923  | 1.5251          | 0.2013 | 0.1122 | 0.1759 | 0.1760    | 19.0    |
| 1.6732        | 8.0   | 7912  | 1.5145          | 0.2024 | 0.1123 | 0.1766 | 0.1766    | 19.0    |
| 1.6597        | 9.0   | 8901  | 1.5079          | 0.2019 | 0.1125 | 0.1751 | 0.1753    | 19.0    |
| 1.6151        | 10.0  | 9890  | 1.5045          | 0.2010 | 0.1123 | 0.1758 | 0.1761    | 19.0    |
| 1.6381        | 11.0  | 10879 | 1.4997          | 0.2009 | 0.1116 | 0.1755 | 0.1756    | 19.0    |
| 1.6148        | 12.0  | 11868 | 1.4974          | 0.2018 | 0.1133 | 0.1763 | 0.1765    | 19.0    |
| 1.6196        | 13.0  | 12857 | 1.4940          | 0.2014 | 0.1129 | 0.1756 | 0.1756    | 19.0    |
| 1.6137        | 14.0  | 13846 | 1.4914          | 0.2025 | 0.1136 | 0.1766 | 0.1768    | 19.0    |
| 1.6313        | 15.0  | 14835 | 1.4873          | 0.2032 | 0.1140 | 0.1769 | 0.1771    | 19.0    |
| 1.6098        | 16.0  | 15824 | 1.4847          | 0.2012 | 0.1133 | 0.1750 | 0.1754    | 19.0    |
| 1.6061        | 17.0  | 16813 | 1.4845          | 0.2019 | 0.1138 | 0.1752 | 0.1755    | 19.0    |
| 1.5918        | 18.0  | 17802 | 1.4833          | 0.2011 | 0.1129 | 0.1747 | 0.1750    | 19.0    |
| 1.5842        | 19.0  | 18791 | 1.4824          | 0.2013 | 0.1133 | 0.1753 | 0.1755    | 19.0    |
| 1.5964        | 20.0  | 19780 | 1.4825          | 0.2010 | 0.1132 | 0.1753 | 0.1755    | 19.0    |
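Validation ROUGE scores like those above are typically computed with the evaluate library; a minimal sketch follows (the exact metric settings used for this card are not documented):

```python
import evaluate  # also requires the rouge_score package

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["the bill amends the internal revenue code"],
    references=["this bill amends the internal revenue code of 1986"],
)
print(scores)  # dict with rouge1, rouge2, rougeL, rougeLsum scores
```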
### Framework versions
- Transformers 4.31.0
- Pytorch 1.13.1.post200
- Datasets 2.10.0
- Tokenizers 0.13.2