deepseek-coder-6.7b-instruct_Fi__components_size_252_epochs_10_2024-06-21_09-35-14_3556545
This model is a fine-tuned version of deepseek-ai/deepseek-coder-6.7b-instruct on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for computing these metrics follows the list):
- Loss: 3.4511
- Accuracy: 0.479
- Chrf: 0.046
- Bleu: 0.0
- Sacrebleu: 0.0
- Rouge1: 0.063
- Rouge2: 0.032
- Rougel: 0.063
- Rougelsum: 0.063
- Meteor: 0.149
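The evaluation script for this run is not published. As a point of reference, here is a minimal sketch of computing the text-similarity metrics above with the Hugging Face `evaluate` library, using placeholder predictions and references. Note that `evaluate`'s chrF and SacreBLEU scores are on a 0-100 scale, while the values reported here appear to be normalized to 0-1; how Accuracy was computed is not documented, so it is omitted.

```python
import evaluate

# Placeholder data; the actual evaluation pairs for this run are not published.
predictions = ["def add(a, b):\n    return a + b"]
references = ["def add(a, b):\n    return a + b"]

chrf = evaluate.load("chrf")
bleu = evaluate.load("bleu")
sacrebleu = evaluate.load("sacrebleu")
rouge = evaluate.load("rouge")
meteor = evaluate.load("meteor")

# chrF / BLEU / SacreBLEU expect one list of reference strings per prediction.
listed_refs = [[r] for r in references]

print(chrf.compute(predictions=predictions, references=listed_refs))       # 0-100 scale
print(bleu.compute(predictions=predictions, references=listed_refs))
print(sacrebleu.compute(predictions=predictions, references=listed_refs))  # 0-100 scale
print(rouge.compute(predictions=predictions, references=references))       # rouge1/2/L/Lsum
print(meteor.compute(predictions=predictions, references=references))
```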
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (see the reproduction sketch after this list):
- learning_rate: 0.001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 3407
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 4
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 252
- training_steps: 2520
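A minimal sketch of how this configuration maps onto `transformers.TrainingArguments`, assuming the run used the standard `Trainer`; the dataset, data collator, and any PEFT/quantization setup are not documented and are omitted here:

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters. A per-device batch size of 1 across
# 4 GPUs yields the reported total batch size of 4; the multi-GPU launch
# itself would come from torchrun/accelerate, not from these arguments.
training_args = TrainingArguments(
    output_dir="./results",        # placeholder path
    learning_rate=1e-3,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=3407,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    warmup_steps=252,
    max_steps=2520,
)
```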
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Chrf | Bleu | Sacrebleu | Rouge1 | Rouge2 | Rougel | Rougelsum | Meteor |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0245 | 4.0 | 252 | 1.2112 | 0.462 | 0.57 | 0.412 | 0.4 | 0.502 | 0.294 | 0.457 | 0.493 | 0.541 |
| 0.0673 | 8.0 | 504 | 2.4917 | 0.483 | 0.268 | 0.193 | 0.2 | 0.354 | 0.173 | 0.326 | 0.346 | 0.295 |
| 0.1033 | 12.0 | 756 | 3.5579 | 0.508 | 0.039 | 0.025 | 0.0 | 0.122 | 0.077 | 0.122 | 0.122 | 0.158 |
| 1.2756 | 16.0 | 1008 | 3.7590 | 0.481 | 0.046 | 0.0 | 0.0 | 0.169 | 0.086 | 0.169 | 0.169 | 0.148 |
| 0.247 | 20.0 | 1260 | 3.8850 | 0.503 | 0.037 | 0.0 | 0.0 | 0.005 | 0.004 | 0.005 | 0.005 | 0.136 |
| 0.5668 | 24.0 | 1512 | 3.6216 | 0.518 | 0.034 | 0.0 | 0.0 | 0.019 | 0.013 | 0.021 | 0.021 | 0.145 |
| 0.0462 | 28.0 | 1764 | 3.5317 | 0.517 | 0.037 | 0.0 | 0.0 | 0.008 | 0.007 | 0.008 | 0.008 | 0.15 |
| 0.2198 | 32.0 | 2016 | 3.5121 | 0.503 | 0.044 | 0.0 | 0.0 | 0.07 | 0.035 | 0.07 | 0.07 | 0.156 |
| 0.0588 | 36.0 | 2268 | 3.4731 | 0.488 | 0.046 | 0.0 | 0.0 | 0.062 | 0.031 | 0.062 | 0.062 | 0.155 |
| 0.0639 | 40.0 | 2520 | 3.4511 | 0.479 | 0.046 | 0.0 | 0.0 | 0.063 | 0.032 | 0.063 | 0.063 | 0.149 |
Framework versions
- Transformers 4.37.0
- Pytorch 2.2.1+cu121
- Datasets 2.20.0
- Tokenizers 0.15.2
Model tree for vdavidr/deepseek-coder-6.7b-instruct_Fi__components_size_252_epochs_10_2024-06-21_09-35-14_3556545
- Base model: deepseek-ai/deepseek-coder-6.7b-instruct
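A minimal usage sketch, assuming the checkpoint loads with the standard `transformers` APIs; the chat template, dtype, and generation settings below are assumptions, not documented for this run:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vdavidr/deepseek-coder-6.7b-instruct_Fi__components_size_252_epochs_10_2024-06-21_09-35-14_3556545"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Assumes the tokenizer carries the base model's chat template.
messages = [{"role": "user", "content": "Write a function that checks whether a number is prime."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```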