---
license: apache-2.0
base_model: google/siglip-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - stanford-dogs
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: google-siglip-base-patch16-224-batch64-lr5e-05-standford-dogs
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: stanford-dogs
          type: stanford-dogs
          config: default
          split: full
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8364917395529641
          - name: F1
            type: f1
            value: 0.8328749982143954
          - name: Precision
            type: precision
            value: 0.8377481660081763
          - name: Recall
            type: recall
            value: 0.8330663170433035
---

# google-siglip-base-patch16-224-batch64-lr5e-05-standford-dogs

This model is a fine-tuned version of [google/siglip-base-patch16-224](https://huggingface.co/google/siglip-base-patch16-224) on the stanford-dogs dataset. It achieves the following results on the evaluation set:

- Loss: 0.5612
- Accuracy: 0.8365
- F1: 0.8329
- Precision: 0.8377
- Recall: 0.8331
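
Below is a minimal inference sketch using the `transformers` image-classification pipeline. The Hub repository id and the image path are assumptions, not confirmed by this card; substitute the actual values.

```python
from transformers import pipeline

# Assumed repository id; replace with the actual Hub id if it differs.
classifier = pipeline(
    "image-classification",
    model="amaye15/google-siglip-base-patch16-224-batch64-lr5e-05-standford-dogs",
)

# "dog.jpg" is a placeholder path to a local image.
predictions = classifier("dog.jpg", top_k=5)
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```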

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
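
As a sketch, these values map onto `TrainingArguments` roughly as follows. `output_dir` is a placeholder, and the evaluation/logging settings are assumptions inferred from the 10-step cadence of the results table below; anything not listed above falls back to library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="siglip-stanford-dogs",  # placeholder, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,      # 64 * 4 = 256 total train batch size
    max_steps=1000,                     # training_steps above
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",        # assumption: evaluated every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```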

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 4.822         | 0.1550  | 10   | 4.2549          | 0.0782   | 0.0493 | 0.0987    | 0.0726 |
| 4.236         | 0.3101  | 20   | 3.5279          | 0.1907   | 0.1507 | 0.2201    | 0.1830 |
| 3.5066        | 0.4651  | 30   | 2.5316          | 0.3319   | 0.2941 | 0.4180    | 0.3205 |
| 2.8064        | 0.6202  | 40   | 2.1243          | 0.4361   | 0.4090 | 0.5324    | 0.4282 |
| 2.441         | 0.7752  | 50   | 1.5798          | 0.5510   | 0.5250 | 0.6242    | 0.5438 |
| 2.0985        | 0.9302  | 60   | 1.4242          | 0.5843   | 0.5577 | 0.6400    | 0.5768 |
| 1.8689        | 1.0853  | 70   | 1.1481          | 0.6625   | 0.6456 | 0.7143    | 0.6565 |
| 1.6588        | 1.2403  | 80   | 1.1937          | 0.6465   | 0.6361 | 0.7062    | 0.6439 |
| 1.5807        | 1.3953  | 90   | 0.9818          | 0.7058   | 0.6890 | 0.7438    | 0.6981 |
| 1.4851        | 1.5504  | 100  | 1.0181          | 0.7000   | 0.6839 | 0.7373    | 0.6959 |
| 1.5033        | 1.7054  | 110  | 1.0169          | 0.6914   | 0.6845 | 0.7490    | 0.6883 |
| 1.3022        | 1.8605  | 120  | 0.9087          | 0.7276   | 0.7170 | 0.7643    | 0.7222 |
| 1.3106        | 2.0155  | 130  | 0.8385          | 0.7432   | 0.7352 | 0.7667    | 0.7363 |
| 1.1721        | 2.1705  | 140  | 0.8957          | 0.7128   | 0.7026 | 0.7592    | 0.7075 |
| 1.131         | 2.3256  | 150  | 0.8730          | 0.7259   | 0.7149 | 0.7687    | 0.7196 |
| 1.1223        | 2.4806  | 160  | 0.8132          | 0.7546   | 0.7457 | 0.7855    | 0.7482 |
| 1.0688        | 2.6357  | 170  | 0.7485          | 0.7704   | 0.7601 | 0.7863    | 0.7631 |
| 1.0686        | 2.7907  | 180  | 0.7559          | 0.7651   | 0.7587 | 0.7920    | 0.7609 |
| 0.9733        | 2.9457  | 190  | 0.7779          | 0.7553   | 0.7458 | 0.7797    | 0.7521 |
| 0.9287        | 3.1008  | 200  | 0.7048          | 0.7818   | 0.7756 | 0.7981    | 0.7756 |
| 0.8746        | 3.2558  | 210  | 0.6848          | 0.7867   | 0.7774 | 0.8034    | 0.7822 |
| 0.7982        | 3.4109  | 220  | 0.6930          | 0.7884   | 0.7796 | 0.8025    | 0.7846 |
| 0.823         | 3.5659  | 230  | 0.7041          | 0.7804   | 0.7717 | 0.7975    | 0.7752 |
| 0.8713        | 3.7209  | 240  | 0.7418          | 0.7755   | 0.7646 | 0.8053    | 0.7711 |
| 0.8651        | 3.8760  | 250  | 0.6847          | 0.7828   | 0.7773 | 0.8048    | 0.7782 |
| 0.784         | 4.0310  | 260  | 0.6662          | 0.7923   | 0.7841 | 0.8097    | 0.7860 |
| 0.6894        | 4.1860  | 270  | 0.6980          | 0.7843   | 0.7781 | 0.8024    | 0.7779 |
| 0.7727        | 4.3411  | 280  | 0.6629          | 0.7833   | 0.7804 | 0.8030    | 0.7798 |
| 0.6978        | 4.4961  | 290  | 0.6820          | 0.7845   | 0.7800 | 0.8011    | 0.7820 |
| 0.7032        | 4.6512  | 300  | 0.6148          | 0.8032   | 0.7969 | 0.8094    | 0.7985 |
| 0.6978        | 4.8062  | 310  | 0.6457          | 0.7940   | 0.7872 | 0.8085    | 0.7892 |
| 0.66          | 4.9612  | 320  | 0.6242          | 0.8088   | 0.8033 | 0.8246    | 0.8058 |
| 0.5706        | 5.1163  | 330  | 0.6404          | 0.7966   | 0.7905 | 0.8097    | 0.7928 |
| 0.5456        | 5.2713  | 340  | 0.7147          | 0.7872   | 0.7767 | 0.8060    | 0.7819 |
| 0.5869        | 5.4264  | 350  | 0.6267          | 0.8066   | 0.8016 | 0.8188    | 0.8025 |
| 0.6022        | 5.5814  | 360  | 0.6197          | 0.8061   | 0.8028 | 0.8209    | 0.8027 |
| 0.5676        | 5.7364  | 370  | 0.6061          | 0.8059   | 0.8005 | 0.8140    | 0.8024 |
| 0.5456        | 5.8915  | 380  | 0.6018          | 0.8069   | 0.8006 | 0.8254    | 0.8033 |
| 0.56          | 6.0465  | 390  | 0.6126          | 0.8090   | 0.8037 | 0.8206    | 0.8045 |
| 0.4582        | 6.2016  | 400  | 0.6122          | 0.8115   | 0.8062 | 0.8196    | 0.8061 |
| 0.4594        | 6.3566  | 410  | 0.6058          | 0.8122   | 0.8081 | 0.8235    | 0.8082 |
| 0.4868        | 6.5116  | 420  | 0.5890          | 0.8195   | 0.8131 | 0.8300    | 0.8141 |
| 0.4841        | 6.6667  | 430  | 0.5909          | 0.8175   | 0.8119 | 0.8250    | 0.8133 |
| 0.4537        | 6.8217  | 440  | 0.5889          | 0.8195   | 0.8153 | 0.8261    | 0.8164 |
| 0.4807        | 6.9767  | 450  | 0.6105          | 0.8144   | 0.8104 | 0.8300    | 0.8106 |
| 0.4051        | 7.1318  | 460  | 0.5917          | 0.8171   | 0.8103 | 0.8217    | 0.8131 |
| 0.3727        | 7.2868  | 470  | 0.6037          | 0.8166   | 0.8116 | 0.8262    | 0.8125 |
| 0.4034        | 7.4419  | 480  | 0.6407          | 0.8032   | 0.8003 | 0.8146    | 0.8015 |
| 0.3684        | 7.5969  | 490  | 0.6205          | 0.8061   | 0.7997 | 0.8176    | 0.8008 |
| 0.416         | 7.7519  | 500  | 0.5855          | 0.8258   | 0.8207 | 0.8364    | 0.8211 |
| 0.3947        | 7.9070  | 510  | 0.5802          | 0.8214   | 0.8179 | 0.8283    | 0.8179 |
| 0.3731        | 8.0620  | 520  | 0.5870          | 0.8239   | 0.8191 | 0.8324    | 0.8188 |
| 0.3203        | 8.2171  | 530  | 0.5783          | 0.8265   | 0.8211 | 0.8302    | 0.8216 |
| 0.337         | 8.3721  | 540  | 0.5836          | 0.8200   | 0.8162 | 0.8247    | 0.8166 |
| 0.3396        | 8.5271  | 550  | 0.5992          | 0.8156   | 0.8121 | 0.8253    | 0.8115 |
| 0.3355        | 8.6822  | 560  | 0.5755          | 0.8229   | 0.8182 | 0.8281    | 0.8187 |
| 0.3273        | 8.8372  | 570  | 0.5819          | 0.8246   | 0.8194 | 0.8268    | 0.8208 |
| 0.3181        | 8.9922  | 580  | 0.5840          | 0.8205   | 0.8174 | 0.8279    | 0.8168 |
| 0.2855        | 9.1473  | 590  | 0.5997          | 0.8144   | 0.8098 | 0.8213    | 0.8103 |
| 0.254         | 9.3023  | 600  | 0.5863          | 0.8183   | 0.8132 | 0.8251    | 0.8133 |
| 0.2781        | 9.4574  | 610  | 0.5779          | 0.8224   | 0.8169 | 0.8275    | 0.8195 |
| 0.2691        | 9.6124  | 620  | 0.5816          | 0.8219   | 0.8177 | 0.8257    | 0.8186 |
| 0.3018        | 9.7674  | 630  | 0.5814          | 0.8297   | 0.8250 | 0.8370    | 0.8253 |
| 0.2615        | 9.9225  | 640  | 0.5761          | 0.8299   | 0.8261 | 0.8377    | 0.8262 |
| 0.2707        | 10.0775 | 650  | 0.5640          | 0.8326   | 0.8283 | 0.8385    | 0.8284 |
| 0.2482        | 10.2326 | 660  | 0.5685          | 0.8246   | 0.8206 | 0.8284    | 0.8218 |
| 0.2493        | 10.3876 | 670  | 0.5717          | 0.8241   | 0.8208 | 0.8311    | 0.8199 |
| 0.2167        | 10.5426 | 680  | 0.5741          | 0.8246   | 0.8204 | 0.8273    | 0.8204 |
| 0.2628        | 10.6977 | 690  | 0.5791          | 0.8248   | 0.8205 | 0.8281    | 0.8216 |
| 0.2316        | 10.8527 | 700  | 0.5770          | 0.8321   | 0.8272 | 0.8348    | 0.8284 |
| 0.2326        | 11.0078 | 710  | 0.5755          | 0.8280   | 0.8249 | 0.8348    | 0.8249 |
| 0.2001        | 11.1628 | 720  | 0.5783          | 0.8336   | 0.8299 | 0.8354    | 0.8310 |
| 0.1759        | 11.3178 | 730  | 0.5804          | 0.8345   | 0.8302 | 0.8367    | 0.8311 |
| 0.202         | 11.4729 | 740  | 0.5820          | 0.8316   | 0.8278 | 0.8353    | 0.8280 |
| 0.2191        | 11.6279 | 750  | 0.5724          | 0.8324   | 0.8279 | 0.8341    | 0.8287 |
| 0.1955        | 11.7829 | 760  | 0.5957          | 0.8226   | 0.8181 | 0.8268    | 0.8198 |
| 0.1972        | 11.9380 | 770  | 0.5722          | 0.8294   | 0.8254 | 0.8318    | 0.8263 |
| 0.1848        | 12.0930 | 780  | 0.5731          | 0.8311   | 0.8269 | 0.8339    | 0.8281 |
| 0.1613        | 12.2481 | 790  | 0.5682          | 0.8382   | 0.8344 | 0.8397    | 0.8356 |
| 0.1665        | 12.4031 | 800  | 0.5565          | 0.8350   | 0.8325 | 0.8365    | 0.8325 |
| 0.1739        | 12.5581 | 810  | 0.5738          | 0.8360   | 0.8328 | 0.8395    | 0.8326 |
| 0.1744        | 12.7132 | 820  | 0.5628          | 0.8360   | 0.8327 | 0.8387    | 0.8328 |
| 0.1737        | 12.8682 | 830  | 0.5712          | 0.8355   | 0.8320 | 0.8395    | 0.8324 |
| 0.1635        | 13.0233 | 840  | 0.5745          | 0.8309   | 0.8256 | 0.8328    | 0.8269 |
| 0.1689        | 13.1783 | 850  | 0.5781          | 0.8326   | 0.8288 | 0.8358    | 0.8294 |
| 0.1611        | 13.3333 | 860  | 0.5740          | 0.8328   | 0.8280 | 0.8349    | 0.8289 |
| 0.1624        | 13.4884 | 870  | 0.5656          | 0.8324   | 0.8279 | 0.8328    | 0.8287 |
| 0.1635        | 13.6434 | 880  | 0.5618          | 0.8319   | 0.8276 | 0.8328    | 0.8280 |
| 0.1395        | 13.7984 | 890  | 0.5648          | 0.8350   | 0.8311 | 0.8368    | 0.8312 |
| 0.1489        | 13.9535 | 900  | 0.5666          | 0.8341   | 0.8304 | 0.8370    | 0.8304 |
| 0.1174        | 14.1085 | 910  | 0.5700          | 0.8358   | 0.8321 | 0.8400    | 0.8320 |
| 0.1274        | 14.2636 | 920  | 0.5720          | 0.8331   | 0.8295 | 0.8366    | 0.8295 |
| 0.134         | 14.4186 | 930  | 0.5657          | 0.8353   | 0.8311 | 0.8369    | 0.8317 |
| 0.1327        | 14.5736 | 940  | 0.5662          | 0.8343   | 0.8308 | 0.8367    | 0.8307 |
| 0.1165        | 14.7287 | 950  | 0.5654          | 0.8341   | 0.8301 | 0.8355    | 0.8303 |
| 0.1277        | 14.8837 | 960  | 0.5661          | 0.8345   | 0.8308 | 0.8360    | 0.8310 |
| 0.1221        | 15.0388 | 970  | 0.5615          | 0.8370   | 0.8335 | 0.8388    | 0.8335 |
| 0.1194        | 15.1938 | 980  | 0.5632          | 0.8353   | 0.8318 | 0.8369    | 0.8319 |
| 0.1126        | 15.3488 | 990  | 0.5616          | 0.8362   | 0.8326 | 0.8376    | 0.8327 |
| 0.1256        | 15.5039 | 1000 | 0.5612          | 0.8365   | 0.8329 | 0.8377    | 0.8331 |
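
For reference, a metric function in the style of a `Trainer` `compute_metrics` callback could produce the columns above. This is a sketch: the `macro` averaging for F1, precision, and recall is an assumption, as the card does not state which averaging was used across the classes.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair as passed by Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Averaging mode is an assumption; the card does not specify it.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```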

### Framework versions

- Transformers 4.40.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.19.1