
resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of microsoft/resnet-50; the training dataset is not named in the card metadata. It achieves the following results on the evaluation set (a brief sketch of how the calibration metrics are commonly computed follows this list):

  • Loss: 0.5837
  • Accuracy: 0.7867
  • Brier Loss: 0.3013
  • NLL: 1.9882
  • F1 Micro: 0.7868
  • F1 Macro: 0.7860
  • ECE: 0.0529
  • AURC: 0.0581
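Brier loss, ECE (expected calibration error), and AURC are confidence- and calibration-oriented metrics. As a point of reference only (this is a generic sketch, not the exact evaluation code behind the numbers above), the multi-class Brier score and a fixed-bin ECE are commonly computed from softmax probabilities like this:

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared error between softmax probabilities and one-hot labels."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 15) -> float:
    """Average gap between confidence and accuracy over equal-width confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```

The exact binning scheme and reduction used for this card's numbers may differ; the sketch only illustrates the standard definitions.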

Model description

More information needed
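For orientation only: the repository name (kd_CEKD_t2.5_a0.5) is suggestive of knowledge distillation with a combined cross-entropy/KL objective, temperature 2.5, and mixing weight 0.5, but none of this is confirmed by the card itself. Under that assumption, a generic sketch of such a loss would look like:

```python
import torch
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    """Generic CE + KD objective; 2.5 and 0.5 are read off the repo name, not the card."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd
```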

Intended uses & limitations

More information needed
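As a generic starting point, the checkpoint can presumably be loaded with the standard Transformers image-classification classes. The sketch below assumes the image processor config is bundled with the checkpoint; example_document.png is a placeholder input, and the label set depends on the checkpoint's config:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "bdpc/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder document image; replace with your own file.
image = Image.open("example_document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_label = model.config.id2label[logits.argmax(-1).item()]
print(predicted_label)
```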

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments configuration follows this list):

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
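
For illustration, these settings roughly correspond to the transformers.TrainingArguments below. The output_dir, evaluation cadence, and logging_steps are inferred (per-epoch evaluation and a first training-loss log after step 500 are consistent with the results table), and the distillation-specific training loop is not shown:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",   # the table reports one validation row per epoch
    logging_steps=500,             # consistent with "No log" at step 250
)
```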

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|--------------:|------:|-----:|----------------:|---------:|-----------:|----:|---------:|---------:|----:|-----:|
| No log | 1.0 | 250 | 4.1958 | 0.1035 | 0.9350 | 9.1004 | 0.1035 | 0.0792 | 0.0472 | 0.9013 |
| 4.2322 | 2.0 | 500 | 4.0778 | 0.173 | 0.9251 | 6.5742 | 0.173 | 0.1393 | 0.0993 | 0.7501 |
| 4.2322 | 3.0 | 750 | 3.6484 | 0.339 | 0.8778 | 4.9108 | 0.339 | 0.2957 | 0.2172 | 0.5305 |
| 3.5256 | 4.0 | 1000 | 2.5967 | 0.4592 | 0.6991 | 3.3640 | 0.4592 | 0.4220 | 0.1274 | 0.3285 |
| 3.5256 | 5.0 | 1250 | 2.0345 | 0.5417 | 0.6078 | 3.0118 | 0.5417 | 0.5180 | 0.0976 | 0.2447 |
| 1.9172 | 6.0 | 1500 | 1.4417 | 0.625 | 0.5029 | 2.7890 | 0.625 | 0.6123 | 0.0549 | 0.1623 |
| 1.9172 | 7.0 | 1750 | 1.3298 | 0.639 | 0.4852 | 2.6110 | 0.639 | 0.6320 | 0.0558 | 0.1501 |
| 1.1801 | 8.0 | 2000 | 1.1697 | 0.674 | 0.4473 | 2.4787 | 0.674 | 0.6712 | 0.0466 | 0.1283 |
| 1.1801 | 9.0 | 2250 | 0.9625 | 0.7093 | 0.4020 | 2.3242 | 0.7093 | 0.7085 | 0.0526 | 0.1017 |
| 0.8029 | 10.0 | 2500 | 0.9477 | 0.7215 | 0.3893 | 2.3193 | 0.7215 | 0.7228 | 0.0515 | 0.0971 |
| 0.8029 | 11.0 | 2750 | 0.8527 | 0.7375 | 0.3692 | 2.2785 | 0.7375 | 0.7377 | 0.0490 | 0.0870 |
| 0.5717 | 12.0 | 3000 | 0.7377 | 0.7515 | 0.3470 | 2.1475 | 0.7515 | 0.7529 | 0.0552 | 0.0757 |
| 0.5717 | 13.0 | 3250 | 0.7309 | 0.7498 | 0.3469 | 2.1250 | 0.7498 | 0.7494 | 0.0589 | 0.0758 |
| 0.4414 | 14.0 | 3500 | 0.7165 | 0.7558 | 0.3427 | 2.1045 | 0.7558 | 0.7576 | 0.0582 | 0.0721 |
| 0.4414 | 15.0 | 3750 | 0.6865 | 0.7678 | 0.3319 | 2.0457 | 0.7678 | 0.7688 | 0.0551 | 0.0697 |
| 0.3691 | 16.0 | 4000 | 0.7002 | 0.7662 | 0.3348 | 2.1280 | 0.7663 | 0.7664 | 0.0567 | 0.0698 |
| 0.3691 | 17.0 | 4250 | 0.6896 | 0.7628 | 0.3326 | 2.0750 | 0.7628 | 0.7631 | 0.0608 | 0.0691 |
| 0.3214 | 18.0 | 4500 | 0.6666 | 0.7715 | 0.3258 | 2.0468 | 0.7715 | 0.7707 | 0.0544 | 0.0680 |
| 0.3214 | 19.0 | 4750 | 0.6735 | 0.7702 | 0.3277 | 2.0544 | 0.7702 | 0.7700 | 0.0571 | 0.0681 |
| 0.2914 | 20.0 | 5000 | 0.6607 | 0.772 | 0.3241 | 2.0364 | 0.772 | 0.7729 | 0.0525 | 0.0659 |
| 0.2914 | 21.0 | 5250 | 0.6625 | 0.7688 | 0.3217 | 2.0387 | 0.7688 | 0.7703 | 0.0455 | 0.0664 |
| 0.2653 | 22.0 | 5500 | 0.6543 | 0.775 | 0.3200 | 2.0560 | 0.775 | 0.7752 | 0.0507 | 0.0647 |
| 0.2653 | 23.0 | 5750 | 0.6409 | 0.7725 | 0.3188 | 2.0091 | 0.7725 | 0.7733 | 0.0554 | 0.0647 |
| 0.2482 | 24.0 | 6000 | 0.6452 | 0.7758 | 0.3191 | 2.0256 | 0.7758 | 0.7756 | 0.0502 | 0.0655 |
| 0.2482 | 25.0 | 6250 | 0.6401 | 0.7742 | 0.3196 | 2.0668 | 0.7742 | 0.7745 | 0.0528 | 0.0648 |
| 0.2354 | 26.0 | 6500 | 0.6316 | 0.775 | 0.3171 | 2.0150 | 0.775 | 0.7755 | 0.0555 | 0.0634 |
| 0.2354 | 27.0 | 6750 | 0.6257 | 0.7808 | 0.3147 | 2.0129 | 0.7808 | 0.7808 | 0.0503 | 0.0624 |
| 0.2229 | 28.0 | 7000 | 0.6343 | 0.7778 | 0.3144 | 2.0910 | 0.7778 | 0.7776 | 0.0510 | 0.0624 |
| 0.2229 | 29.0 | 7250 | 0.6206 | 0.781 | 0.3115 | 2.0399 | 0.7810 | 0.7798 | 0.0555 | 0.0606 |
| 0.2147 | 30.0 | 7500 | 0.6262 | 0.777 | 0.3124 | 2.0603 | 0.777 | 0.7772 | 0.0539 | 0.0616 |
| 0.2147 | 31.0 | 7750 | 0.6265 | 0.7788 | 0.3137 | 2.0833 | 0.7788 | 0.7777 | 0.0532 | 0.0614 |
| 0.2058 | 32.0 | 8000 | 0.6134 | 0.7815 | 0.3119 | 2.0369 | 0.7815 | 0.7815 | 0.0514 | 0.0615 |
| 0.2058 | 33.0 | 8250 | 0.6153 | 0.7772 | 0.3133 | 2.0513 | 0.7773 | 0.7772 | 0.0534 | 0.0623 |
| 0.1994 | 34.0 | 8500 | 0.6143 | 0.7853 | 0.3098 | 2.0188 | 0.7853 | 0.7857 | 0.0508 | 0.0611 |
| 0.1994 | 35.0 | 8750 | 0.6096 | 0.7827 | 0.3086 | 2.0134 | 0.7828 | 0.7828 | 0.0512 | 0.0606 |
| 0.1932 | 36.0 | 9000 | 0.6094 | 0.784 | 0.3067 | 2.0151 | 0.7840 | 0.7847 | 0.0471 | 0.0602 |
| 0.1932 | 37.0 | 9250 | 0.6142 | 0.7833 | 0.3111 | 2.0213 | 0.7833 | 0.7829 | 0.0542 | 0.0608 |
| 0.1895 | 38.0 | 9500 | 0.6103 | 0.7812 | 0.3094 | 2.0594 | 0.7812 | 0.7799 | 0.0529 | 0.0603 |
| 0.1895 | 39.0 | 9750 | 0.6059 | 0.781 | 0.3078 | 2.0386 | 0.7810 | 0.7806 | 0.0545 | 0.0607 |
| 0.1848 | 40.0 | 10000 | 0.6042 | 0.782 | 0.3072 | 2.0133 | 0.782 | 0.7824 | 0.0527 | 0.0603 |
| 0.1848 | 41.0 | 10250 | 0.5991 | 0.785 | 0.3043 | 2.0124 | 0.785 | 0.7853 | 0.0496 | 0.0594 |
| 0.1793 | 42.0 | 10500 | 0.6034 | 0.784 | 0.3058 | 2.0607 | 0.7840 | 0.7838 | 0.0490 | 0.0599 |
| 0.1793 | 43.0 | 10750 | 0.6047 | 0.7827 | 0.3068 | 2.0139 | 0.7828 | 0.7819 | 0.0492 | 0.0595 |
| 0.1768 | 44.0 | 11000 | 0.5982 | 0.785 | 0.3057 | 2.0303 | 0.785 | 0.7843 | 0.0473 | 0.0596 |
| 0.1768 | 45.0 | 11250 | 0.6036 | 0.7795 | 0.3087 | 2.0173 | 0.7795 | 0.7788 | 0.0549 | 0.0607 |
| 0.1743 | 46.0 | 11500 | 0.5974 | 0.785 | 0.3060 | 2.0026 | 0.785 | 0.7839 | 0.0478 | 0.0596 |
| 0.1743 | 47.0 | 11750 | 0.5996 | 0.782 | 0.3068 | 2.0144 | 0.782 | 0.7825 | 0.0480 | 0.0598 |
| 0.1707 | 48.0 | 12000 | 0.5958 | 0.7833 | 0.3079 | 2.0344 | 0.7833 | 0.7827 | 0.0500 | 0.0598 |
| 0.1707 | 49.0 | 12250 | 0.5969 | 0.782 | 0.3060 | 2.0162 | 0.782 | 0.7820 | 0.0482 | 0.0597 |
| 0.1683 | 50.0 | 12500 | 0.5933 | 0.784 | 0.3043 | 1.9897 | 0.7840 | 0.7836 | 0.0496 | 0.0589 |
| 0.1683 | 51.0 | 12750 | 0.5935 | 0.7833 | 0.3042 | 2.0142 | 0.7833 | 0.7829 | 0.0501 | 0.0586 |
| 0.1649 | 52.0 | 13000 | 0.5950 | 0.7847 | 0.3050 | 2.0125 | 0.7847 | 0.7851 | 0.0475 | 0.0591 |
| 0.1649 | 53.0 | 13250 | 0.5904 | 0.7837 | 0.3020 | 1.9830 | 0.7837 | 0.7837 | 0.0504 | 0.0584 |
| 0.1636 | 54.0 | 13500 | 0.5926 | 0.785 | 0.3042 | 2.0006 | 0.785 | 0.7845 | 0.0493 | 0.0588 |
| 0.1636 | 55.0 | 13750 | 0.5885 | 0.7847 | 0.3029 | 2.0025 | 0.7847 | 0.7843 | 0.0505 | 0.0585 |
| 0.1616 | 56.0 | 14000 | 0.5920 | 0.788 | 0.3041 | 2.0174 | 0.788 | 0.7878 | 0.0520 | 0.0591 |
| 0.1616 | 57.0 | 14250 | 0.5927 | 0.7863 | 0.3033 | 2.0321 | 0.7863 | 0.7858 | 0.0521 | 0.0588 |
| 0.1592 | 58.0 | 14500 | 0.5878 | 0.787 | 0.3017 | 1.9751 | 0.787 | 0.7874 | 0.0461 | 0.0584 |
| 0.1592 | 59.0 | 14750 | 0.5888 | 0.7867 | 0.3030 | 1.9996 | 0.7868 | 0.7864 | 0.0494 | 0.0582 |
| 0.1585 | 60.0 | 15000 | 0.5929 | 0.786 | 0.3052 | 2.0237 | 0.786 | 0.7857 | 0.0512 | 0.0584 |
| 0.1585 | 61.0 | 15250 | 0.5894 | 0.7865 | 0.3026 | 1.9895 | 0.7865 | 0.7864 | 0.0548 | 0.0585 |
| 0.1562 | 62.0 | 15500 | 0.5903 | 0.7873 | 0.3033 | 1.9670 | 0.7873 | 0.7870 | 0.0481 | 0.0584 |
| 0.1562 | 63.0 | 15750 | 0.5896 | 0.7853 | 0.3023 | 1.9681 | 0.7853 | 0.7850 | 0.0520 | 0.0587 |
| 0.1548 | 64.0 | 16000 | 0.5903 | 0.7847 | 0.3027 | 1.9865 | 0.7847 | 0.7846 | 0.0506 | 0.0587 |
| 0.1548 | 65.0 | 16250 | 0.5910 | 0.7853 | 0.3039 | 2.0009 | 0.7853 | 0.7849 | 0.0515 | 0.0593 |
| 0.1537 | 66.0 | 16500 | 0.5866 | 0.7883 | 0.3012 | 1.9561 | 0.7883 | 0.7881 | 0.0447 | 0.0581 |
| 0.1537 | 67.0 | 16750 | 0.5858 | 0.7867 | 0.3009 | 1.9868 | 0.7868 | 0.7861 | 0.0486 | 0.0577 |
| 0.1526 | 68.0 | 17000 | 0.5886 | 0.7867 | 0.3024 | 2.0009 | 0.7868 | 0.7862 | 0.0530 | 0.0587 |
| 0.1526 | 69.0 | 17250 | 0.5850 | 0.7863 | 0.3010 | 2.0095 | 0.7863 | 0.7860 | 0.0510 | 0.0581 |
| 0.1508 | 70.0 | 17500 | 0.5867 | 0.7865 | 0.3019 | 2.0304 | 0.7865 | 0.7861 | 0.0525 | 0.0583 |
| 0.1508 | 71.0 | 17750 | 0.5895 | 0.7857 | 0.3038 | 2.0013 | 0.7857 | 0.7853 | 0.0478 | 0.0586 |
| 0.15 | 72.0 | 18000 | 0.5894 | 0.7847 | 0.3025 | 2.0051 | 0.7847 | 0.7845 | 0.0500 | 0.0586 |
| 0.15 | 73.0 | 18250 | 0.5867 | 0.7865 | 0.3022 | 1.9634 | 0.7865 | 0.7860 | 0.0489 | 0.0582 |
| 0.149 | 74.0 | 18500 | 0.5888 | 0.7857 | 0.3026 | 1.9817 | 0.7857 | 0.7851 | 0.0497 | 0.0584 |
| 0.149 | 75.0 | 18750 | 0.5823 | 0.7885 | 0.2994 | 1.9873 | 0.7885 | 0.7880 | 0.0476 | 0.0577 |
| 0.1483 | 76.0 | 19000 | 0.5866 | 0.7853 | 0.3025 | 1.9870 | 0.7853 | 0.7849 | 0.0531 | 0.0583 |
| 0.1483 | 77.0 | 19250 | 0.5866 | 0.7867 | 0.3013 | 1.9933 | 0.7868 | 0.7862 | 0.0498 | 0.0577 |
| 0.1478 | 78.0 | 19500 | 0.5844 | 0.787 | 0.3010 | 1.9793 | 0.787 | 0.7868 | 0.0465 | 0.0579 |
| 0.1478 | 79.0 | 19750 | 0.5850 | 0.7857 | 0.3005 | 1.9856 | 0.7857 | 0.7855 | 0.0489 | 0.0580 |
| 0.1463 | 80.0 | 20000 | 0.5829 | 0.7893 | 0.2999 | 2.0003 | 0.7893 | 0.7890 | 0.0543 | 0.0578 |
| 0.1463 | 81.0 | 20250 | 0.5845 | 0.7867 | 0.3011 | 2.0178 | 0.7868 | 0.7864 | 0.0494 | 0.0580 |
| 0.1457 | 82.0 | 20500 | 0.5878 | 0.7865 | 0.3022 | 2.0108 | 0.7865 | 0.7861 | 0.0507 | 0.0583 |
| 0.1457 | 83.0 | 20750 | 0.5862 | 0.7865 | 0.3016 | 1.9996 | 0.7865 | 0.7865 | 0.0505 | 0.0585 |
| 0.1452 | 84.0 | 21000 | 0.5851 | 0.7863 | 0.3011 | 2.0002 | 0.7863 | 0.7859 | 0.0481 | 0.0582 |
| 0.1452 | 85.0 | 21250 | 0.5850 | 0.787 | 0.3013 | 1.9659 | 0.787 | 0.7867 | 0.0524 | 0.0582 |
| 0.1449 | 86.0 | 21500 | 0.5878 | 0.7867 | 0.3023 | 1.9837 | 0.7868 | 0.7866 | 0.0526 | 0.0581 |
| 0.1449 | 87.0 | 21750 | 0.5844 | 0.7873 | 0.3010 | 1.9807 | 0.7873 | 0.7865 | 0.0522 | 0.0577 |
| 0.1437 | 88.0 | 22000 | 0.5846 | 0.7877 | 0.3012 | 1.9947 | 0.7877 | 0.7869 | 0.0464 | 0.0580 |
| 0.1437 | 89.0 | 22250 | 0.5859 | 0.787 | 0.3016 | 2.0002 | 0.787 | 0.7867 | 0.0503 | 0.0581 |
| 0.143 | 90.0 | 22500 | 0.5838 | 0.7865 | 0.3010 | 1.9996 | 0.7865 | 0.7859 | 0.0496 | 0.0576 |
| 0.143 | 91.0 | 22750 | 0.5843 | 0.7837 | 0.3011 | 1.9683 | 0.7837 | 0.7834 | 0.0501 | 0.0583 |
| 0.1426 | 92.0 | 23000 | 0.5843 | 0.7873 | 0.3010 | 1.9960 | 0.7873 | 0.7870 | 0.0524 | 0.0578 |
| 0.1426 | 93.0 | 23250 | 0.5827 | 0.7847 | 0.3005 | 1.9719 | 0.7847 | 0.7844 | 0.0506 | 0.0579 |
| 0.1428 | 94.0 | 23500 | 0.5831 | 0.7865 | 0.3009 | 1.9781 | 0.7865 | 0.7862 | 0.0517 | 0.0579 |
| 0.1428 | 95.0 | 23750 | 0.5821 | 0.784 | 0.3001 | 1.9641 | 0.7840 | 0.7838 | 0.0505 | 0.0579 |
| 0.1424 | 96.0 | 24000 | 0.5850 | 0.7845 | 0.3020 | 1.9667 | 0.7845 | 0.7842 | 0.0526 | 0.0584 |
| 0.1424 | 97.0 | 24250 | 0.5850 | 0.7847 | 0.3012 | 1.9776 | 0.7847 | 0.7844 | 0.0508 | 0.0579 |
| 0.142 | 98.0 | 24500 | 0.5845 | 0.7877 | 0.3011 | 1.9745 | 0.7877 | 0.7870 | 0.0491 | 0.0579 |
| 0.142 | 99.0 | 24750 | 0.5834 | 0.7853 | 0.3010 | 1.9679 | 0.7853 | 0.7852 | 0.0506 | 0.0581 |
| 0.1416 | 100.0 | 25000 | 0.5837 | 0.7867 | 0.3013 | 1.9882 | 0.7868 | 0.7860 | 0.0529 | 0.0581 |

Framework versions

  • Transformers 4.33.3
  • PyTorch 2.2.0.dev20231002
  • Datasets 2.7.1
  • Tokenizers 0.13.3