
# dit-base_tobacco-tiny_tobacco3482_simkd

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (`None` in the auto-generated card; the model name suggests Tobacco-3482). It achieves the following results on the evaluation set (a sketch of how the calibration metrics can be computed follows the list):

- Loss: 0.7298
- Accuracy: 0.8
- Brier Loss: 0.3356
- NLL: 1.1950
- F1 Micro: 0.8000
- F1 Macro: 0.7677
- ECE: 0.2868
- AURC: 0.0614
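
For reference, the calibration metrics reported above can be computed from predicted class probabilities roughly as in the NumPy sketch below. This is not the evaluation code used for this card: the multi-class Brier definition (sum over classes) and the 10-bin equal-width ECE binning are assumptions.

```python
import numpy as np

def brier_loss(probs, labels):
    """Multi-class Brier score: mean over samples of the squared
    distance between the probability vector and the one-hot label."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: confidence/accuracy gap averaged over equal-width confidence
    bins, weighted by the fraction of samples falling in each bin."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        in_bin = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return ece
```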

## Model description

More information needed

## Intended uses & limitations

More information needed
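
In the meantime, here is a minimal loading sketch, assuming the checkpoint is compatible with the standard `AutoModelForImageClassification` API (the SimKD distillation wrapper may in fact require custom code, and `document.png` is a placeholder path):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jordyvl/dit-base_tobacco-tiny_tobacco3482_simkd"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```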

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
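
These map onto `transformers.TrainingArguments` roughly as sketched below; `output_dir` is a placeholder, and the SimKD distillation loss itself would be wired up elsewhere (it is not documented in this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-base_tobacco-tiny_tobacco3482_simkd",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,       # Adam betas/epsilon as reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```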

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:---:|:--------:|:--------:|:---:|:----:|
| No log | 1.0 | 50 | 1.0044 | 0.11 | 0.8970 | 5.3755 | 0.11 | 0.0297 | 0.1810 | 0.9082 |
| No log | 2.0 | 100 | 0.9997 | 0.27 | 0.8946 | 5.6759 | 0.27 | 0.1038 | 0.2752 | 0.7229 |
| No log | 3.0 | 150 | 0.9946 | 0.345 | 0.8902 | 4.6234 | 0.345 | 0.1969 | 0.3377 | 0.6577 |
| No log | 4.0 | 200 | 0.9814 | 0.4 | 0.8686 | 3.0912 | 0.4000 | 0.2605 | 0.3687 | 0.3808 |
| No log | 5.0 | 250 | 0.9618 | 0.56 | 0.8277 | 2.9065 | 0.56 | 0.4439 | 0.4769 | 0.2239 |
| No log | 6.0 | 300 | 0.9225 | 0.58 | 0.7429 | 2.5647 | 0.58 | 0.4408 | 0.4561 | 0.1944 |
| No log | 7.0 | 350 | 0.8843 | 0.705 | 0.6414 | 2.4145 | 0.705 | 0.5531 | 0.4493 | 0.1261 |
| No log | 8.0 | 400 | 0.8627 | 0.685 | 0.5773 | 2.4171 | 0.685 | 0.5755 | 0.3710 | 0.1378 |
| No log | 9.0 | 450 | 0.8252 | 0.73 | 0.5158 | 1.6133 | 0.7300 | 0.6403 | 0.3706 | 0.1066 |
| 0.9306 | 10.0 | 500 | 0.8164 | 0.74 | 0.4861 | 1.9299 | 0.74 | 0.6672 | 0.3352 | 0.1090 |
| 0.9306 | 11.0 | 550 | 0.8350 | 0.67 | 0.5078 | 2.0291 | 0.67 | 0.6083 | 0.3271 | 0.1514 |
| 0.9306 | 12.0 | 600 | 0.8089 | 0.695 | 0.4680 | 1.6726 | 0.695 | 0.6065 | 0.3049 | 0.1040 |
| 0.9306 | 13.0 | 650 | 0.7847 | 0.78 | 0.4097 | 1.3710 | 0.78 | 0.7067 | 0.3090 | 0.0825 |
| 0.9306 | 14.0 | 700 | 0.7793 | 0.8 | 0.3952 | 1.4382 | 0.8000 | 0.7351 | 0.3189 | 0.0823 |
| 0.9306 | 15.0 | 750 | 0.7756 | 0.775 | 0.3979 | 1.2640 | 0.775 | 0.6997 | 0.2950 | 0.0835 |
| 0.9306 | 16.0 | 800 | 0.7888 | 0.765 | 0.3927 | 1.2499 | 0.765 | 0.6894 | 0.3175 | 0.0719 |
| 0.9306 | 17.0 | 850 | 0.7596 | 0.795 | 0.3603 | 1.1834 | 0.795 | 0.7250 | 0.2930 | 0.0673 |
| 0.9306 | 18.0 | 900 | 0.7581 | 0.795 | 0.3580 | 1.1902 | 0.795 | 0.7241 | 0.3104 | 0.0665 |
| 0.9306 | 19.0 | 950 | 0.7546 | 0.81 | 0.3547 | 1.1055 | 0.81 | 0.7583 | 0.3024 | 0.0621 |
| 0.7329 | 20.0 | 1000 | 0.7520 | 0.81 | 0.3547 | 1.1284 | 0.81 | 0.7533 | 0.3209 | 0.0581 |
| 0.7329 | 21.0 | 1050 | 0.7669 | 0.775 | 0.3906 | 1.3812 | 0.775 | 0.7502 | 0.3212 | 0.0794 |
| 0.7329 | 22.0 | 1100 | 0.7532 | 0.81 | 0.3591 | 1.0982 | 0.81 | 0.7836 | 0.3035 | 0.0708 |
| 0.7329 | 23.0 | 1150 | 0.7519 | 0.805 | 0.3643 | 1.0628 | 0.805 | 0.7742 | 0.2813 | 0.0732 |
| 0.7329 | 24.0 | 1200 | 0.7494 | 0.795 | 0.3614 | 1.1123 | 0.795 | 0.7618 | 0.2988 | 0.0699 |
| 0.7329 | 25.0 | 1250 | 0.7517 | 0.79 | 0.3696 | 1.0703 | 0.79 | 0.7606 | 0.3081 | 0.0800 |
| 0.7329 | 26.0 | 1300 | 0.7513 | 0.795 | 0.3629 | 1.1020 | 0.795 | 0.7769 | 0.2797 | 0.0722 |
| 0.7329 | 27.0 | 1350 | 0.7485 | 0.795 | 0.3552 | 1.0352 | 0.795 | 0.7671 | 0.2678 | 0.0684 |
| 0.7329 | 28.0 | 1400 | 0.7442 | 0.805 | 0.3471 | 1.0956 | 0.805 | 0.7706 | 0.2807 | 0.0630 |
| 0.7329 | 29.0 | 1450 | 0.7473 | 0.795 | 0.3592 | 1.1204 | 0.795 | 0.7685 | 0.2897 | 0.0722 |
| 0.6917 | 30.0 | 1500 | 0.7449 | 0.815 | 0.3482 | 1.0584 | 0.815 | 0.7862 | 0.2949 | 0.0629 |
| 0.6917 | 31.0 | 1550 | 0.7443 | 0.8 | 0.3512 | 1.1010 | 0.8000 | 0.7770 | 0.2954 | 0.0622 |
| 0.6917 | 32.0 | 1600 | 0.7454 | 0.785 | 0.3543 | 1.0994 | 0.785 | 0.7631 | 0.2957 | 0.0639 |
| 0.6917 | 33.0 | 1650 | 0.7421 | 0.815 | 0.3449 | 1.1826 | 0.815 | 0.7853 | 0.2996 | 0.0592 |
| 0.6917 | 34.0 | 1700 | 0.7454 | 0.79 | 0.3559 | 1.1000 | 0.79 | 0.7597 | 0.2964 | 0.0659 |
| 0.6917 | 35.0 | 1750 | 0.7418 | 0.815 | 0.3477 | 1.1616 | 0.815 | 0.7867 | 0.3133 | 0.0617 |
| 0.6917 | 36.0 | 1800 | 0.7425 | 0.815 | 0.3464 | 1.1274 | 0.815 | 0.7949 | 0.3173 | 0.0578 |
| 0.6917 | 37.0 | 1850 | 0.7421 | 0.8 | 0.3448 | 1.1909 | 0.8000 | 0.7732 | 0.2900 | 0.0639 |
| 0.6917 | 38.0 | 1900 | 0.7415 | 0.795 | 0.3471 | 1.1816 | 0.795 | 0.7594 | 0.2860 | 0.0655 |
| 0.6917 | 39.0 | 1950 | 0.7405 | 0.78 | 0.3502 | 1.1084 | 0.78 | 0.7491 | 0.2709 | 0.0650 |
| 0.6764 | 40.0 | 2000 | 0.7398 | 0.81 | 0.3457 | 1.1746 | 0.81 | 0.7797 | 0.2973 | 0.0603 |
| 0.6764 | 41.0 | 2050 | 0.7394 | 0.805 | 0.3437 | 1.1201 | 0.805 | 0.7764 | 0.2915 | 0.0626 |
| 0.6764 | 42.0 | 2100 | 0.7380 | 0.81 | 0.3420 | 1.0987 | 0.81 | 0.7861 | 0.2815 | 0.0583 |
| 0.6764 | 43.0 | 2150 | 0.7386 | 0.8 | 0.3437 | 1.1855 | 0.8000 | 0.7667 | 0.2804 | 0.0617 |
| 0.6764 | 44.0 | 2200 | 0.7398 | 0.795 | 0.3437 | 1.1138 | 0.795 | 0.7660 | 0.2719 | 0.0614 |
| 0.6764 | 45.0 | 2250 | 0.7384 | 0.805 | 0.3441 | 1.1100 | 0.805 | 0.7699 | 0.3065 | 0.0628 |
| 0.6764 | 46.0 | 2300 | 0.7389 | 0.79 | 0.3488 | 1.1079 | 0.79 | 0.7552 | 0.2615 | 0.0647 |
| 0.6764 | 47.0 | 2350 | 0.7368 | 0.8 | 0.3440 | 1.1095 | 0.8000 | 0.7698 | 0.2908 | 0.0624 |
| 0.6764 | 48.0 | 2400 | 0.7365 | 0.8 | 0.3452 | 1.0995 | 0.8000 | 0.7739 | 0.2838 | 0.0645 |
| 0.6764 | 49.0 | 2450 | 0.7365 | 0.8 | 0.3367 | 1.0442 | 0.8000 | 0.7712 | 0.2735 | 0.0585 |
| 0.6662 | 50.0 | 2500 | 0.7342 | 0.815 | 0.3379 | 1.1009 | 0.815 | 0.7815 | 0.2964 | 0.0584 |
| 0.6662 | 51.0 | 2550 | 0.7340 | 0.805 | 0.3358 | 1.0985 | 0.805 | 0.7723 | 0.2635 | 0.0593 |
| 0.6662 | 52.0 | 2600 | 0.7370 | 0.8 | 0.3429 | 1.1227 | 0.8000 | 0.7709 | 0.2841 | 0.0603 |
| 0.6662 | 53.0 | 2650 | 0.7325 | 0.81 | 0.3380 | 1.1110 | 0.81 | 0.7790 | 0.3022 | 0.0601 |
| 0.6662 | 54.0 | 2700 | 0.7320 | 0.8 | 0.3363 | 1.0621 | 0.8000 | 0.7647 | 0.2815 | 0.0607 |
| 0.6662 | 55.0 | 2750 | 0.7324 | 0.805 | 0.3321 | 0.9926 | 0.805 | 0.7693 | 0.2972 | 0.0600 |
| 0.6662 | 56.0 | 2800 | 0.7318 | 0.805 | 0.3364 | 1.0537 | 0.805 | 0.7681 | 0.2554 | 0.0612 |
| 0.6662 | 57.0 | 2850 | 0.7311 | 0.82 | 0.3355 | 1.1133 | 0.82 | 0.7862 | 0.2776 | 0.0594 |
| 0.6662 | 58.0 | 2900 | 0.7317 | 0.81 | 0.3331 | 1.0662 | 0.81 | 0.7797 | 0.2600 | 0.0579 |
| 0.6662 | 59.0 | 2950 | 0.7327 | 0.805 | 0.3382 | 1.1876 | 0.805 | 0.7735 | 0.2797 | 0.0621 |
| 0.6577 | 60.0 | 3000 | 0.7322 | 0.8 | 0.3356 | 1.1864 | 0.8000 | 0.7680 | 0.2797 | 0.0612 |
| 0.6577 | 61.0 | 3050 | 0.7327 | 0.795 | 0.3391 | 1.1347 | 0.795 | 0.7614 | 0.2883 | 0.0641 |
| 0.6577 | 62.0 | 3100 | 0.7315 | 0.815 | 0.3364 | 1.1227 | 0.815 | 0.7848 | 0.2681 | 0.0599 |
| 0.6577 | 63.0 | 3150 | 0.7316 | 0.805 | 0.3392 | 1.0608 | 0.805 | 0.7717 | 0.2742 | 0.0632 |
| 0.6577 | 64.0 | 3200 | 0.7313 | 0.82 | 0.3341 | 1.0601 | 0.82 | 0.7878 | 0.2950 | 0.0583 |
| 0.6577 | 65.0 | 3250 | 0.7322 | 0.805 | 0.3388 | 1.1837 | 0.805 | 0.7747 | 0.2806 | 0.0638 |
| 0.6577 | 66.0 | 3300 | 0.7311 | 0.805 | 0.3373 | 1.0157 | 0.805 | 0.7757 | 0.2880 | 0.0629 |
| 0.6577 | 67.0 | 3350 | 0.7310 | 0.805 | 0.3344 | 1.1878 | 0.805 | 0.7766 | 0.2499 | 0.0609 |
| 0.6577 | 68.0 | 3400 | 0.7326 | 0.805 | 0.3391 | 1.0847 | 0.805 | 0.7729 | 0.2824 | 0.0636 |
| 0.6577 | 69.0 | 3450 | 0.7302 | 0.805 | 0.3376 | 1.1932 | 0.805 | 0.7778 | 0.2789 | 0.0617 |
| 0.6528 | 70.0 | 3500 | 0.7305 | 0.81 | 0.3359 | 0.9988 | 0.81 | 0.7787 | 0.2769 | 0.0622 |
| 0.6528 | 71.0 | 3550 | 0.7300 | 0.81 | 0.3328 | 1.0833 | 0.81 | 0.7776 | 0.2914 | 0.0594 |
| 0.6528 | 72.0 | 3600 | 0.7300 | 0.81 | 0.3343 | 1.1426 | 0.81 | 0.7776 | 0.2843 | 0.0594 |
| 0.6528 | 73.0 | 3650 | 0.7285 | 0.805 | 0.3341 | 1.1237 | 0.805 | 0.7701 | 0.2723 | 0.0614 |
| 0.6528 | 74.0 | 3700 | 0.7303 | 0.81 | 0.3368 | 1.1928 | 0.81 | 0.7768 | 0.2926 | 0.0612 |
| 0.6528 | 75.0 | 3750 | 0.7290 | 0.805 | 0.3318 | 1.0669 | 0.805 | 0.7709 | 0.2810 | 0.0603 |
| 0.6528 | 76.0 | 3800 | 0.7316 | 0.8 | 0.3382 | 1.1392 | 0.8000 | 0.7687 | 0.2505 | 0.0636 |
| 0.6528 | 77.0 | 3850 | 0.7284 | 0.8 | 0.3337 | 1.1338 | 0.8000 | 0.7720 | 0.2677 | 0.0610 |
| 0.6528 | 78.0 | 3900 | 0.7303 | 0.805 | 0.3373 | 1.1969 | 0.805 | 0.7729 | 0.2745 | 0.0618 |
| 0.6528 | 79.0 | 3950 | 0.7297 | 0.805 | 0.3369 | 1.1970 | 0.805 | 0.7743 | 0.2731 | 0.0606 |
| 0.6489 | 80.0 | 4000 | 0.7296 | 0.795 | 0.3362 | 1.1328 | 0.795 | 0.7656 | 0.2620 | 0.0627 |
| 0.6489 | 81.0 | 4050 | 0.7295 | 0.805 | 0.3363 | 1.1358 | 0.805 | 0.7726 | 0.2540 | 0.0608 |
| 0.6489 | 82.0 | 4100 | 0.7290 | 0.795 | 0.3341 | 1.1389 | 0.795 | 0.7668 | 0.2661 | 0.0630 |
| 0.6489 | 83.0 | 4150 | 0.7289 | 0.8 | 0.3364 | 1.0597 | 0.8000 | 0.7678 | 0.2838 | 0.0615 |
| 0.6489 | 84.0 | 4200 | 0.7291 | 0.805 | 0.3351 | 1.1277 | 0.805 | 0.7743 | 0.2621 | 0.0608 |
| 0.6489 | 85.0 | 4250 | 0.7297 | 0.795 | 0.3353 | 1.1953 | 0.795 | 0.7668 | 0.2666 | 0.0622 |
| 0.6489 | 86.0 | 4300 | 0.7286 | 0.805 | 0.3339 | 1.1278 | 0.805 | 0.7735 | 0.2668 | 0.0608 |
| 0.6489 | 87.0 | 4350 | 0.7298 | 0.8 | 0.3361 | 1.1423 | 0.8000 | 0.7677 | 0.2613 | 0.0614 |
| 0.6489 | 88.0 | 4400 | 0.7296 | 0.805 | 0.3346 | 1.1927 | 0.805 | 0.7743 | 0.2789 | 0.0612 |
| 0.6489 | 89.0 | 4450 | 0.7299 | 0.8 | 0.3359 | 1.1950 | 0.8000 | 0.7686 | 0.2500 | 0.0613 |
| 0.6462 | 90.0 | 4500 | 0.7297 | 0.805 | 0.3354 | 1.1934 | 0.805 | 0.7743 | 0.2939 | 0.0613 |
| 0.6462 | 91.0 | 4550 | 0.7294 | 0.8 | 0.3353 | 1.1313 | 0.8000 | 0.7685 | 0.2808 | 0.0610 |
| 0.6462 | 92.0 | 4600 | 0.7297 | 0.805 | 0.3356 | 1.1349 | 0.805 | 0.7765 | 0.2668 | 0.0614 |
| 0.6462 | 93.0 | 4650 | 0.7298 | 0.8 | 0.3354 | 1.1954 | 0.8000 | 0.7685 | 0.2700 | 0.0613 |
| 0.6462 | 94.0 | 4700 | 0.7301 | 0.8 | 0.3362 | 1.1951 | 0.8000 | 0.7677 | 0.2722 | 0.0616 |
| 0.6462 | 95.0 | 4750 | 0.7299 | 0.805 | 0.3360 | 1.1957 | 0.805 | 0.7743 | 0.2619 | 0.0614 |
| 0.6462 | 96.0 | 4800 | 0.7299 | 0.805 | 0.3357 | 1.1946 | 0.805 | 0.7743 | 0.2892 | 0.0611 |
| 0.6462 | 97.0 | 4850 | 0.7297 | 0.8 | 0.3355 | 1.1954 | 0.8000 | 0.7686 | 0.2703 | 0.0613 |
| 0.6462 | 98.0 | 4900 | 0.7298 | 0.8 | 0.3359 | 1.1952 | 0.8000 | 0.7677 | 0.2892 | 0.0615 |
| 0.6462 | 99.0 | 4950 | 0.7298 | 0.8 | 0.3357 | 1.1951 | 0.8000 | 0.7677 | 0.2720 | 0.0614 |
| 0.645 | 100.0 | 5000 | 0.7298 | 0.8 | 0.3356 | 1.1950 | 0.8000 | 0.7677 | 0.2868 | 0.0614 |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.2.0.dev20231112+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1