Bill Psomas committed on
Commit abfe1a7
1 Parent(s): 5bd9f96

update readme

Files changed (4):
  1. README.md +53 -0
  2. checkpoint.pth +3 -0
  3. configs.yaml +45 -0
  4. log.txt +300 -0
README.md CHANGED
@@ -1,3 +1,56 @@
---
license: cc-by-4.0
datasets:
- imagenet-1k
metrics:
- accuracy
pipeline_tag: image-classification
language:
- en
tags:
- vision transformer
- simpool
- dino
- computer vision
- deep learning
---

# Self-supervised ViT-S/16 (small-sized Vision Transformer with patch size 16) model with SimPool

ViT-S model with SimPool (gamma=1.25) trained on ImageNet-1k for 300 epochs. Self-supervised with [DINO](https://arxiv.org/abs/2104.14294).

SimPool is a simple attention-based pooling method applied at the end of the network, introduced in the ICCV 2023 [paper](https://arxiv.org/pdf/2309.06891.pdf) and released in this [repository](https://github.com/billpsomas/simpool/).
Disclaimer: This model card was written by the author of SimPool, i.e. [Bill Psomas](http://users.ntua.gr/psomasbill/).
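The checkpoint.pth in this commit is a DINO training checkpoint, so loading the backbone for inference usually means pulling the weights out of a nested state dict. Below is a minimal sketch, assuming the standard DINO checkpoint layout (`teacher`/`student` entries whose backbone parameters carry a `backbone.` prefix); inspect the file to confirm before relying on it:

```python
def extract_backbone_state(checkpoint, model_key="teacher"):
    """Pull backbone weights out of a DINO-style checkpoint dict.

    DINO checkpoints usually nest weights under 'teacher'/'student',
    with backbone parameters prefixed by 'backbone.'. (Layout assumed
    here, not verified against this exact file.)
    """
    state = checkpoint.get(model_key, checkpoint)
    return {
        k[len("backbone."):]: v
        for k, v in state.items()
        if k.startswith("backbone.")
    }

# In practice (not run here, requires the 700 MB file):
#   import torch
#   ckpt = torch.load("checkpoint.pth", map_location="cpu")
#   backbone_sd = extract_backbone_state(ckpt, "teacher")
#   model.load_state_dict(backbone_sd, strict=False)
```

The teacher weights are typically the ones used for evaluation in DINO, which is why `model_key` defaults to `"teacher"`.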

## Motivation

Convolutional networks and vision transformers have different forms of pairwise interactions, pooling across layers and pooling at the end of the network. Does the latter really need to be different?
As a by-product of pooling, vision transformers provide spatial attention for free, but this is most often of low quality unless self-supervised, which is not well studied. Is supervision really the problem?

## Method

SimPool is a simple attention-based pooling mechanism that replaces the default pooling in both convolutional and transformer encoders. For transformers, we completely discard the [CLS] token.
Interestingly, we find that, whether supervised or self-supervised, SimPool improves performance on pre-training and downstream tasks and provides attention maps delineating object boundaries in all cases.
One could thus call SimPool universal.
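In rough terms, SimPool initializes a pooled query by global average pooling and refines it with one cross-attention step over the patch tokens. The NumPy sketch below illustrates that idea only; it omits SimPool's normalization layers and the gamma power-normalization of the attention, and the names and shapes are illustrative rather than the repository's implementation:

```python
import numpy as np

def simpool_sketch(X, W_q, W_k):
    """Simplified attention-based pooling in the spirit of SimPool.

    X: (N, d) patch tokens; W_q, W_k: (d, d) projection matrices.
    The query starts as the global average of the tokens (GAP), then
    one softmax-attention step over the patches produces the pooled
    representation. LayerNorm and the gamma exponent are omitted.
    """
    d = X.shape[1]
    u = X.mean(axis=0)               # GAP initialization of the query
    q = u @ W_q                      # refined query, (d,)
    k = X @ W_k                      # keys, (N, d)
    logits = k @ q / np.sqrt(d)      # scaled dot-product scores, (N,)
    a = np.exp(logits - logits.max())
    a /= a.sum()                     # attention distribution over patches
    return a @ X                     # pooled vector, (d,)

rng = np.random.default_rng(0)
X = rng.normal(size=(196, 384))      # e.g. 14x14 patches at ViT-S width
z = simpool_sketch(X, np.eye(384), np.eye(384))
print(z.shape)                       # (384,)
```

Note that the values are the raw tokens `X` themselves, so the pooled output stays in the encoder's feature space, which is what makes the result a drop-in replacement for [CLS] or GAP.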

## Evaluation with k-NN

| k | top1 | top5 |
| ------- | ------- | ------- |
| 10 | 72.56 | 87.638 |
| 20 | 72.434 | 89.24 |
| 100 | 70.526 | 90.582 |
| 200 | 69.33 | 90.424 |
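These numbers follow DINO's k-NN protocol: frozen features, cosine similarity, and temperature-weighted voting among the k nearest training samples. A simplified sketch of such a classifier (not the exact evaluation code; `T=0.07` matches the `temperature` value in this run's configs.yaml):

```python
import numpy as np

def knn_classify(train_feats, train_labels, test_feats, k=20, T=0.07,
                 num_classes=None):
    """DINO-style weighted k-NN on L2-normalized features (sketch)."""
    num_classes = num_classes or int(train_labels.max()) + 1
    train = train_feats / np.linalg.norm(train_feats, axis=1, keepdims=True)
    test = test_feats / np.linalg.norm(test_feats, axis=1, keepdims=True)
    sim = test @ train.T                    # cosine similarities
    idx = np.argsort(-sim, axis=1)[:, :k]   # indices of k nearest neighbors
    preds = []
    for row_sim, row_idx in zip(sim, idx):
        w = np.exp(row_sim[row_idx] / T)    # temperature-weighted votes
        votes = np.zeros(num_classes)
        for weight, label in zip(w, train_labels[row_idx]):
            votes[label] += weight
        preds.append(int(votes.argmax()))
    return np.array(preds)
```

In the real evaluation the features come from the frozen backbone over the full ImageNet train/val splits, batched to fit in memory.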

## BibTeX entry and citation info

```bibtex
@misc{psomas2023simpool,
  title={Keep It SimPool: Who Said Supervised Transformers Suffer from Attention Deficit?},
  author={Bill Psomas and Ioannis Kakogeorgiou and Konstantinos Karantzalos and Yannis Avrithis},
  year={2023},
  eprint={2309.06891},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
```
checkpoint.pth ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d27b4a8e32efc2e82fa5d0dd9568d4667a4cbb0e953e90d654ff805bb729c4cd
size 709502528
configs.yaml ADDED
@@ -0,0 +1,45 @@
arch: vit_small
backend: nccl
batch_size_per_gpu: 128
clip_grad: 0.0
data_path: /path/to/imagenet/
dist_url: env://
drop_path_rate: 0.1
epochs: 300
eval_every: 100
freeze_last_layer: 1
global_crops_scale:
- 0.25
- 1.0
local_crops_number: 6
local_crops_scale:
- 0.05
- 0.25
local_rank: 0
lr: 0.0005
min_lr: 1.0e-05
mode: simpool
momentum_teacher: 0.996
nb_knn:
- 10
- 20
- 100
- 200
norm_last_layer: false
num_workers: 10
optimizer: adamw
out_dim: 65536
output_dir: /path/to/output/
patch_size: 16
saveckp_freq: 20
seed: 0
subset: -1
teacher_temp: 0.07
temperature: 0.07
use_bn_in_head: false
use_fp16: false
warmup_epochs: 10
warmup_teacher_temp: 0.04
warmup_teacher_temp_epochs: 30
weight_decay: 0.04
weight_decay_end: 0.4
log.txt ADDED
@@ -0,0 +1,300 @@
1
+ {"train_loss": 9.071028601541984, "train_entropy": 7.519578872086238, "train_KL_div": 1.5514497012912416, "train_lr": 9.992805180270203e-05, "train_wd": 0.04000328590649605, "epoch": 0, "k-NN": {"10": {"top1": 2.278, "top5": 5.728}, "20": {"top1": 2.582, "top5": 6.244}, "100": {"top1": 2.902, "top5": 8.146}, "200": {"top1": 3.032, "top5": 8.748}}}
2
+ {"train_loss": 9.690155197199967, "train_entropy": 8.997641174770374, "train_KL_div": 0.6925140092377182, "train_lr": 0.00029994404029099034, "train_wd": 0.04002301668527509, "epoch": 1}
3
+ {"train_loss": 9.653354346799812, "train_entropy": 8.536663297936023, "train_KL_div": 1.1166910561952563, "train_lr": 0.0004999600287792787, "train_wd": 0.040062483968430894, "epoch": 2}
4
+ {"train_loss": 8.541468389123844, "train_entropy": 6.856502194008191, "train_KL_div": 1.6849661850266986, "train_lr": 0.0006999760172675668, "train_wd": 0.040121683427931, "epoch": 3}
5
+ {"train_loss": 7.805606635068532, "train_entropy": 5.719026180098859, "train_KL_div": 2.0865804372216874, "train_lr": 0.0008999920057558545, "train_wd": 0.04020060857188757, "epoch": 4}
6
+ {"train_loss": 7.328848383552451, "train_entropy": 4.91031758073899, "train_KL_div": 2.4185307727252647, "train_lr": 0.0011000079942441437, "train_wd": 0.04029925074526934, "epoch": 5}
7
+ {"train_loss": 6.806116920580967, "train_entropy": 4.3143443720136805, "train_KL_div": 2.491772541825434, "train_lr": 0.0013000239827324325, "train_wd": 0.04041759913085018, "epoch": 6}
8
+ {"train_loss": 6.404553665341043, "train_entropy": 3.77957458666665, "train_KL_div": 2.624979100734305, "train_lr": 0.00150003997122072, "train_wd": 0.04055564075039605, "epoch": 7}
9
+ {"train_loss": 6.34396169406714, "train_entropy": 3.5368926924147863, "train_KL_div": 2.8070689989603776, "train_lr": 0.0017000559597090108, "train_wd": 0.04071336046608778, "epoch": 8}
10
+ {"train_loss": 6.032440011926311, "train_entropy": 3.346282668703561, "train_KL_div": 2.6861573532997944, "train_lr": 0.0019000719481972992, "train_wd": 0.040890740982181185, "epoch": 9}
11
+ {"train_loss": 5.466853112434979, "train_entropy": 2.7792011871755267, "train_KL_div": 2.6876519303575312, "train_lr": 0.001999980561975582, "train_wd": 0.04108776284690386, "epoch": 10}
12
+ {"train_loss": 5.1440666089240885, "train_entropy": 2.548439257793861, "train_KL_div": 2.5956273491767576, "train_lr": 0.0019998638432648466, "train_wd": 0.04130440445458789, "epoch": 11}
13
+ {"train_loss": 4.807573703267305, "train_entropy": 2.2934413135027905, "train_KL_div": 2.5141323826295867, "train_lr": 0.001999630372871095, "train_wd": 0.041540642048039866, "epoch": 12}
14
+ {"train_loss": 4.624160045914227, "train_entropy": 2.176948201206091, "train_KL_div": 2.447211830497836, "train_lr": 0.0019992801781931334, "train_wd": 0.04179644972114549, "epoch": 13}
15
+ {"train_loss": 4.353815336521866, "train_entropy": 1.9697262155090114, "train_KL_div": 2.3840891146164336, "train_lr": 0.001998813300327836, "train_wd": 0.04207179942171064, "epoch": 14}
16
+ {"train_loss": 4.164108518764174, "train_entropy": 1.8428064972805462, "train_KL_div": 2.3213020183628412, "train_lr": 0.0019982297940654087, "train_wd": 0.0423666609545378, "epoch": 15}
17
+ {"train_loss": 4.020655005693817, "train_entropy": 1.7676790564370861, "train_KL_div": 2.252975948363376, "train_lr": 0.0019975297278828894, "train_wd": 0.04268100198473705, "epoch": 16}
18
+ {"train_loss": 3.9309454837577236, "train_entropy": 1.741489443121006, "train_KL_div": 2.189456049868052, "train_lr": 0.0019967131839361495, "train_wd": 0.04301478804127214, "epoch": 17}
19
+ {"train_loss": 3.9220745496326783, "train_entropy": 1.7922747892131814, "train_KL_div": 2.1297997619084224, "train_lr": 0.0019957802580502448, "train_wd": 0.04336798252074061, "epoch": 18}
20
+ {"train_loss": 4.0479214114727355, "train_entropy": 1.9981370856650442, "train_KL_div": 2.0497843228298414, "train_lr": 0.001994731059708163, "train_wd": 0.04374054669138758, "epoch": 19}
21
+ {"train_loss": 4.132159635293589, "train_entropy": 2.095099870106585, "train_KL_div": 2.037059757635176, "train_lr": 0.00199356571203798, "train_wd": 0.04413243969735341, "epoch": 20}
22
+ {"train_loss": 4.335484990613352, "train_entropy": 2.3297610688242885, "train_KL_div": 2.0057239100920685, "train_lr": 0.0019922843517984116, "train_wd": 0.0445436185631536, "epoch": 21}
23
+ {"train_loss": 4.557045552584765, "train_entropy": 2.5689477555328706, "train_KL_div": 1.988097786391191, "train_lr": 0.001990887129362766, "train_wd": 0.04497403819839207, "epoch": 22}
24
+ {"train_loss": 4.717852031893963, "train_entropy": 2.7469881743454723, "train_KL_div": 1.970863871365714, "train_lr": 0.001989374208701291, "train_wd": 0.045423651402705455, "epoch": 23}
25
+ {"train_loss": 4.77339885903777, "train_entropy": 2.780778867092064, "train_KL_div": 1.992620011909212, "train_lr": 0.0019877457673619287, "train_wd": 0.04589240887093915, "epoch": 24}
26
+ {"train_loss": 4.766025608010906, "train_entropy": 2.7722432428269648, "train_KL_div": 1.9937823895784876, "train_lr": 0.001986001996449492, "train_wd": 0.04638025919855438, "epoch": 25}
27
+ {"train_loss": 4.729289272706286, "train_entropy": 2.7522178698929665, "train_KL_div": 1.9770714225385972, "train_lr": 0.0019841431006032352, "train_wd": 0.04688714888726504, "epoch": 26}
28
+ {"train_loss": 4.748164778442787, "train_entropy": 2.790469658269966, "train_KL_div": 1.957695136324679, "train_lr": 0.001982169297972825, "train_wd": 0.0474130223509048, "epoch": 27}
29
+ {"train_loss": 4.812136475797847, "train_entropy": 2.947943381173052, "train_KL_div": 1.8641931053569658, "train_lr": 0.001980080820192749, "train_wd": 0.04795782192152221, "epoch": 28}
30
+ {"train_loss": 4.764635677246167, "train_entropy": 2.9070474644311424, "train_KL_div": 1.8575882326594169, "train_lr": 0.00197787791235513, "train_wd": 0.04852148785570515, "epoch": 29}
31
+ {"train_loss": 4.7390431776035316, "train_entropy": 2.8947533815741826, "train_KL_div": 1.8442898174222233, "train_lr": 0.0019755608329809805, "train_wd": 0.04910395834113187, "epoch": 30}
32
+ {"train_loss": 4.712030937607817, "train_entropy": 2.876463182324128, "train_KL_div": 1.835567772054939, "train_lr": 0.0019731298539898377, "train_wd": 0.049705169503349717, "epoch": 31}
33
+ {"train_loss": 4.693795666777545, "train_entropy": 2.8671795815396175, "train_KL_div": 1.8266161024379883, "train_lr": 0.0019705852606678557, "train_wd": 0.05032505541277995, "epoch": 32}
34
+ {"train_loss": 4.669186210436977, "train_entropy": 2.845947951650162, "train_KL_div": 1.8232382780356373, "train_lr": 0.0019679273516343453, "train_wd": 0.05096354809194695, "epoch": 33}
35
+ {"train_loss": 4.675410150266666, "train_entropy": 2.8593797869533657, "train_KL_div": 1.8160303755105733, "train_lr": 0.0019651564388067214, "train_wd": 0.05162057752293346, "epoch": 34}
36
+ {"train_loss": 4.6479229153774915, "train_entropy": 2.8268308163070373, "train_KL_div": 1.821092109326169, "train_lr": 0.0019622728473638836, "train_wd": 0.052296071655058056, "epoch": 35}
37
+ {"train_loss": 4.638314727184584, "train_entropy": 2.818482079356313, "train_KL_div": 1.8198326637061666, "train_lr": 0.0019592769157080834, "train_wd": 0.052989956412777164, "epoch": 36}
38
+ {"train_loss": 4.652868549862831, "train_entropy": 2.8345300519971444, "train_KL_div": 1.8183385063346913, "train_lr": 0.001956168995425171, "train_wd": 0.05370215570380788, "epoch": 37}
39
+ {"train_loss": 4.677095753659638, "train_entropy": 2.861664951848183, "train_KL_div": 1.8154308186303512, "train_lr": 0.0019529494512433823, "train_wd": 0.054432591427471845, "epoch": 38}
40
+ {"train_loss": 4.798882739649688, "train_entropy": 3.020602538717165, "train_KL_div": 1.778280209854162, "train_lr": 0.0019496186609904936, "train_wd": 0.05518118348326112, "epoch": 39}
41
+ {"train_loss": 4.760169220842617, "train_entropy": 2.960729308051171, "train_KL_div": 1.7994399321593826, "train_lr": 0.0019461770155495125, "train_wd": 0.05594784977962063, "epoch": 40}
42
+ {"train_loss": 4.758055215330718, "train_entropy": 2.945499232132658, "train_KL_div": 1.8125560058297203, "train_lr": 0.001942624918812792, "train_wd": 0.05673250624295146, "epoch": 41}
43
+ {"train_loss": 4.7755836072585565, "train_entropy": 2.967174761539264, "train_KL_div": 1.8084088669334957, "train_lr": 0.0019389627876346292, "train_wd": 0.05753506682683021, "epoch": 42}
44
+ {"train_loss": 4.7493376855274665, "train_entropy": 2.9250384498414377, "train_KL_div": 1.8242992526597732, "train_lr": 0.0019351910517823622, "train_wd": 0.058355443521444686, "epoch": 43}
45
+ {"train_loss": 4.748515628319946, "train_entropy": 2.9214171314125155, "train_KL_div": 1.8270985189435198, "train_lr": 0.0019313101538859117, "train_wd": 0.05919354636324573, "epoch": 44}
46
+ {"train_loss": 4.735243491751017, "train_entropy": 2.9059675747542073, "train_KL_div": 1.8292759319932625, "train_lr": 0.0019273205493858493, "train_wd": 0.06004928344481208, "epoch": 45}
47
+ {"train_loss": 4.734902209062561, "train_entropy": 2.9007857456910524, "train_KL_div": 1.8341164734485529, "train_lr": 0.0019232227064799568, "train_wd": 0.060922560924929446, "epoch": 46}
48
+ {"train_loss": 4.724453870793708, "train_entropy": 2.896316776530062, "train_KL_div": 1.8281371094506802, "train_lr": 0.0019190171060682643, "train_wd": 0.061813283038881124, "epoch": 47}
49
+ {"train_loss": 4.722629787348253, "train_entropy": 2.8912768666740423, "train_KL_div": 1.8313529361471188, "train_lr": 0.001914704241696639, "train_wd": 0.06272135210895002, "epoch": 48}
50
+ {"train_loss": 4.721045850921306, "train_entropy": 2.8887343341402776, "train_KL_div": 1.8323115381619912, "train_lr": 0.0019102846194988215, "train_wd": 0.06364666855512914, "epoch": 49}
51
+ {"train_loss": 4.746379533236166, "train_entropy": 2.9298368919429354, "train_KL_div": 1.816542657981102, "train_lr": 0.00190575875813708, "train_wd": 0.06458913090604337, "epoch": 50}
52
+ {"train_loss": 4.72889602527344, "train_entropy": 2.9034196550039937, "train_KL_div": 1.8254763890537236, "train_lr": 0.00190112718874131, "train_wd": 0.06554863581007495, "epoch": 51}
53
+ {"train_loss": 4.7280727605358495, "train_entropy": 2.8993432008438735, "train_KL_div": 1.8287295800485581, "train_lr": 0.0018963904548467152, "train_wd": 0.06652507804669865, "epoch": 52}
54
+ {"train_loss": 4.7281605809522, "train_entropy": 2.8998455462886468, "train_KL_div": 1.8283150434660778, "train_lr": 0.0018915491123300198, "train_wd": 0.0675183505380198, "epoch": 53}
55
+ {"train_loss": 4.743489539046749, "train_entropy": 2.9146935528369067, "train_KL_div": 1.8287960072811082, "train_lr": 0.0018866037293442373, "train_wd": 0.06852834436051636, "epoch": 54}
56
+ {"train_loss": 4.768703983508521, "train_entropy": 2.941150638649313, "train_KL_div": 1.8275533609991546, "train_lr": 0.0018815548862519834, "train_wd": 0.06955494875698406, "epoch": 55}
57
+ {"train_loss": 4.811929588552287, "train_entropy": 2.9934682181651455, "train_KL_div": 1.818461388254242, "train_lr": 0.0018764031755573827, "train_wd": 0.07059805114868235, "epoch": 56}
58
+ {"train_loss": 4.815563330380655, "train_entropy": 2.992088271678685, "train_KL_div": 1.8234750728646247, "train_lr": 0.0018711492018365235, "train_wd": 0.07165753714767913, "epoch": 57}
59
+ {"train_loss": 4.829287687365672, "train_entropy": 2.9997139240292716, "train_KL_div": 1.829573781787158, "train_lr": 0.0018657935816665206, "train_wd": 0.07273329056939522, "epoch": 58}
60
+ {"train_loss": 4.83415547168131, "train_entropy": 2.999629109049682, "train_KL_div": 1.8345263904804805, "train_lr": 0.0018603369435531444, "train_wd": 0.07382519344534542, "epoch": 59}
61
+ {"train_loss": 4.839322497137635, "train_entropy": 3.0013297770758993, "train_KL_div": 1.8379927457784482, "train_lr": 0.001854779927857076, "train_wd": 0.07493312603607465, "epoch": 60}
62
+ {"train_loss": 4.844060171493809, "train_entropy": 3.0035087667304357, "train_KL_div": 1.840551420950966, "train_lr": 0.0018491231867187375, "train_wd": 0.0760569668442891, "epoch": 61}
63
+ {"train_loss": 4.8474929625277134, "train_entropy": 3.0066428308149606, "train_KL_div": 1.8408501567266924, "train_lr": 0.001843367383981784, "train_wd": 0.07719659262817957, "epoch": 62}
64
+ {"train_loss": 4.848464238939049, "train_entropy": 3.008748228672883, "train_KL_div": 1.8397160259892138, "train_lr": 0.001837513195115191, "train_wd": 0.07835187841493665, "epoch": 63}
65
+ {"train_loss": 4.8522648759883085, "train_entropy": 3.017057099335676, "train_KL_div": 1.8352077914942369, "train_lr": 0.0018315613071339782, "train_wd": 0.07952269751445465, "epoch": 64}
66
+ {"train_loss": 4.863723314780412, "train_entropy": 3.025951669322882, "train_KL_div": 1.8377716727822804, "train_lr": 0.001825512418518586, "train_wd": 0.08070892153322547, "epoch": 65}
67
+ {"train_loss": 4.8717373841100455, "train_entropy": 3.041487754010659, "train_KL_div": 1.8302496564712265, "train_lr": 0.001819367239132916, "train_wd": 0.081910420388418, "epoch": 66}
68
+ {"train_loss": 4.885654925156555, "train_entropy": 3.0536850785656418, "train_KL_div": 1.8319698605510732, "train_lr": 0.001813126490141012, "train_wd": 0.08312706232214329, "epoch": 67}
69
+ {"train_loss": 4.892397156674608, "train_entropy": 3.0589323639631463, "train_KL_div": 1.8334648157600208, "train_lr": 0.0018067909039224405, "train_wd": 0.08435871391590259, "epoch": 68}
70
+ {"train_loss": 4.898811392265735, "train_entropy": 3.0654112044379387, "train_KL_div": 1.833400209851974, "train_lr": 0.001800361223986334, "train_wd": 0.08560524010521922, "epoch": 69}
71
+ {"train_loss": 4.910386991253097, "train_entropy": 3.0754539242131913, "train_KL_div": 1.8349330941145179, "train_lr": 0.0017938382048841403, "train_wd": 0.0868665041944498, "epoch": 70}
72
+ {"train_loss": 4.920758879346718, "train_entropy": 3.0834174208122667, "train_KL_div": 1.8373414767112473, "train_lr": 0.001787222612121072, "train_wd": 0.08814236787177346, "epoch": 71}
73
+ {"train_loss": 4.934223731644719, "train_entropy": 3.0975239395761758, "train_KL_div": 1.836699815367242, "train_lr": 0.0017805152220662813, "train_wd": 0.08943269122436058, "epoch": 72}
74
+ {"train_loss": 4.951179161775026, "train_entropy": 3.115470321534825, "train_KL_div": 1.835708857094832, "train_lr": 0.0017737168218617137, "train_wd": 0.09073733275371443, "epoch": 73}
75
+ {"train_loss": 4.967166971007316, "train_entropy": 3.1319124390371886, "train_KL_div": 1.8352545485388843, "train_lr": 0.0017668282093297849, "train_wd": 0.0920561493911902, "epoch": 74}
76
+ {"train_loss": 4.980985495731604, "train_entropy": 3.1452269043853813, "train_KL_div": 1.8357586128820333, "train_lr": 0.0017598501928797203, "train_wd": 0.09338899651368121, "epoch": 75}
77
+ {"train_loss": 4.991505312595627, "train_entropy": 3.1579942982212055, "train_KL_div": 1.8335110387689681, "train_lr": 0.0017527835914126848, "train_wd": 0.09473572795948056, "epoch": 76}
78
+ {"train_loss": 5.007351383554945, "train_entropy": 3.1748133661935656, "train_KL_div": 1.8325380327985537, "train_lr": 0.0017456292342256927, "train_wd": 0.09609619604430945, "epoch": 77}
79
+ {"train_loss": 5.027176292013112, "train_entropy": 3.1947942564575125, "train_KL_div": 1.8323820605576278, "train_lr": 0.0017383879609142857, "train_wd": 0.09747025157751095, "epoch": 78}
80
+ {"train_loss": 5.043027326643324, "train_entropy": 3.2095010117661182, "train_KL_div": 1.8335263299808608, "train_lr": 0.0017310606212739875, "train_wd": 0.09885774387841192, "epoch": 79}
81
+ {"train_loss": 5.059857398104801, "train_entropy": 3.225287115735878, "train_KL_div": 1.8345703074185968, "train_lr": 0.0017236480752005907, "train_wd": 0.10025852079284558, "epoch": 80}
82
+ {"train_loss": 5.061882136727599, "train_entropy": 3.2269776945205617, "train_KL_div": 1.8349044609794036, "train_lr": 0.00171615119258925, "train_wd": 0.10167242870983802, "epoch": 81}
83
+ {"train_loss": 5.067160378924186, "train_entropy": 3.229616384068839, "train_KL_div": 1.8375440142590174, "train_lr": 0.001708570853232373, "train_wd": 0.10309931257845319, "epoch": 82}
84
+ {"train_loss": 5.078427195167847, "train_entropy": 3.2403027839559635, "train_KL_div": 1.838124430698933, "train_lr": 0.0017009079467163973, "train_wd": 0.10453901592479503, "epoch": 83}
85
+ {"train_loss": 5.087570182568164, "train_entropy": 3.251177722911278, "train_KL_div": 1.8363924839084955, "train_lr": 0.0016931633723173767, "train_wd": 0.1059913808691678, "epoch": 84}
86
+ {"train_loss": 5.116397328466344, "train_entropy": 3.2803785387608264, "train_KL_div": 1.8360188070246928, "train_lr": 0.0016853380388954517, "train_wd": 0.10745624814338853, "epoch": 85}
87
+ {"train_loss": 5.1356953812255375, "train_entropy": 3.303217595953831, "train_KL_div": 1.8324778066764917, "train_lr": 0.0016774328647881991, "train_wd": 0.10893345710825322, "epoch": 86}
88
+ {"train_loss": 5.156186008958413, "train_entropy": 3.3249422923314103, "train_KL_div": 1.831243732683569, "train_lr": 0.0016694487777028494, "train_wd": 0.11042284577115233, "epoch": 87}
89
+ {"train_loss": 5.169847260943229, "train_entropy": 3.338614489987409, "train_KL_div": 1.8312327895852492, "train_lr": 0.0016613867146074142, "train_wd": 0.1119242508038345, "epoch": 88}
90
+ {"train_loss": 5.211870736117176, "train_entropy": 3.3626094920982084, "train_KL_div": 1.84926126267222, "train_lr": 0.001653247621620758, "train_wd": 0.11343750756031944, "epoch": 89}
91
+ {"train_loss": 5.196844488406162, "train_entropy": 3.378119325728344, "train_KL_div": 1.8187251777695619, "train_lr": 0.0016450324539015165, "train_wd": 0.11496245009495011, "epoch": 90}
92
+ {"train_loss": 5.198224586119755, "train_entropy": 3.359407459422172, "train_KL_div": 1.8388171475782669, "train_lr": 0.0016367421755360532, "train_wd": 0.11649891118059391, "epoch": 91}
93
+ {"train_loss": 5.211713012745626, "train_entropy": 3.366711409400693, "train_KL_div": 1.8450016304076338, "train_lr": 0.0016283777594252881, "train_wd": 0.11804672232697795, "epoch": 92}
94
+ {"train_loss": 5.2268060767869775, "train_entropy": 3.3820145727633286, "train_KL_div": 1.8447915286206895, "train_lr": 0.0016199401871705337, "train_wd": 0.11960571379916764, "epoch": 93}
95
+ {"train_loss": 5.242130160998764, "train_entropy": 3.400630808729443, "train_KL_div": 1.841499357379312, "train_lr": 0.0016114304489583166, "train_wd": 0.12117571463618018, "epoch": 94}
96
+ {"train_loss": 5.254457070244302, "train_entropy": 3.413833241740005, "train_KL_div": 1.8406238446085097, "train_lr": 0.001602849543444135, "train_wd": 0.12275655266973115, "epoch": 95}
97
+ {"train_loss": 5.264728907772677, "train_entropy": 3.4221517288808725, "train_KL_div": 1.8425771920300789, "train_lr": 0.001594198477635301, "train_wd": 0.1243480545431163, "epoch": 96}
98
+ {"train_loss": 5.274195530812898, "train_entropy": 3.435975770822627, "train_KL_div": 1.8382197696265938, "train_lr": 0.001585478266772742, "train_wd": 0.12595004573021978, "epoch": 97}
99
+ {"train_loss": 5.299830599320973, "train_entropy": 3.460097352758967, "train_KL_div": 1.8397332624399023, "train_lr": 0.0015766899342118715, "train_wd": 0.12756235055465595, "epoch": 98}
100
+ {"train_loss": 5.321168333982869, "train_entropy": 3.4857572737834057, "train_KL_div": 1.8354110774114358, "train_lr": 0.001567834511302475, "train_wd": 0.12918479220903117, "epoch": 99}
101
+ {"train_loss": 5.343794101386143, "train_entropy": 3.5074493432883544, "train_KL_div": 1.8363447688180479, "train_lr": 0.0015589130372676992, "train_wd": 0.1308171927743354, "epoch": 100, "k-NN": {"10": {"top1": 67.722, "top5": 84.208}, "20": {"top1": 67.488, "top5": 86.026}, "100": {"top1": 65.314, "top5": 87.354}, "200": {"top1": 64.038, "top5": 86.994}}}
102
+ {"train_loss": 5.352567493582039, "train_entropy": 3.5169819394985646, "train_KL_div": 1.8355855702353325, "train_lr": 0.001549926559082073, "train_wd": 0.13245937323945048, "epoch": 101}
103
+ {"train_loss": 5.362016737699318, "train_entropy": 3.5243532884654574, "train_KL_div": 1.8376634624316919, "train_lr": 0.0015408761313486553, "train_wd": 0.13411115352078318, "epoch": 102}
104
+ {"train_loss": 5.379855998128438, "train_entropy": 3.5402033865023004, "train_KL_div": 1.8396526194995733, "train_lr": 0.0015317628161752707, "train_wd": 0.1357723524820118, "epoch": 103}
105
+ {"train_loss": 5.394928695772477, "train_entropy": 3.5540938279706893, "train_KL_div": 1.8408348844062796, "train_lr": 0.0015225876830498655, "train_wd": 0.13744278795395, "epoch": 104}
106
+ {"train_loss": 5.685450600967895, "train_entropy": 3.830968906601175, "train_KL_div": 1.8544816895545149, "train_lr": 0.0015133518087149835, "train_wd": 0.13912227675452407, "epoch": 105}
107
+ {"train_loss": 5.463800066714283, "train_entropy": 3.6488354697549563, "train_KL_div": 1.814964597947973, "train_lr": 0.0015040562770414372, "train_wd": 0.1408106347088607, "epoch": 106}
108
+ {"train_loss": 5.429864403298147, "train_entropy": 3.5880392271694808, "train_KL_div": 1.8418251778319967, "train_lr": 0.0014947021789010703, "train_wd": 0.14250767666948544, "epoch": 107}
109
+ {"train_loss": 5.423545105708874, "train_entropy": 3.584499695032335, "train_KL_div": 1.8390454122964903, "train_lr": 0.0014852906120387864, "train_wd": 0.1442132165366222, "epoch": 108}
110
+ {"train_loss": 5.428141814341648, "train_entropy": 3.5890839447935137, "train_KL_div": 1.8390578761946954, "train_lr": 0.0014758226809436757, "train_wd": 0.1459270672786049, "epoch": 109}
111
+ {"train_loss": 5.427920204320019, "train_entropy": 3.5907921860877465, "train_KL_div": 1.8371280224131736, "train_lr": 0.001466299496719435, "train_wd": 0.14764904095238682, "epoch": 110}
112
+ {"train_loss": 5.434187751165111, "train_entropy": 3.6015420021484794, "train_KL_div": 1.832645751446557, "train_lr": 0.001456722176953961, "train_wd": 0.14937894872414936, "epoch": 111}
113
+ {"train_loss": 5.45011526451027, "train_entropy": 3.618752071706892, "train_KL_div": 1.8313632070136776, "train_lr": 0.0014470918455881823, "train_wd": 0.15111660089001122, "epoch": 112}
114
+ {"train_loss": 5.506391493346003, "train_entropy": 3.7114407884464753, "train_KL_div": 1.7949507047923254, "train_lr": 0.0014374096327841935, "train_wd": 0.1528618068968313, "epoch": 113}
115
+ {"train_loss": 5.487069893702805, "train_entropy": 3.668940502629101, "train_KL_div": 1.8181293821997113, "train_lr": 0.0014276766747925915, "train_wd": 0.15461437536310293, "epoch": 114}
116
+ {"train_loss": 5.505935238229094, "train_entropy": 3.6776410462759097, "train_KL_div": 1.8282941957767442, "train_lr": 0.0014178941138191419, "train_wd": 0.1563741140999457, "epoch": 115}
117
+ {"train_loss": 5.516362538345331, "train_entropy": 3.69596640995557, "train_KL_div": 1.8203961313556996, "train_lr": 0.0014080630978907625, "train_wd": 0.1581408301321759, "epoch": 116}
118
+ {"train_loss": 5.541736549181904, "train_entropy": 3.7206478833580476, "train_KL_div": 1.8210886610950308, "train_lr": 0.0013981847807207559, "train_wd": 0.1599143297194714, "epoch": 117}
119
+ {"train_loss": 5.648248846630017, "train_entropy": 3.8351576289208196, "train_KL_div": 1.813091217780666, "train_lr": 0.001388260321573437, "train_wd": 0.16169441837761694, "epoch": 118}
120
+ {"train_loss": 5.569643164090783, "train_entropy": 3.752559046665256, "train_KL_div": 1.8170841127800808, "train_lr": 0.0013782908851280935, "train_wd": 0.16348090089983194, "epoch": 119}
121
+ {"train_loss": 5.568959839243016, "train_entropy": 3.7430124330482513, "train_KL_div": 1.8259474056944858, "train_lr": 0.0013682776413422959, "train_wd": 0.1652735813781757, "epoch": 120}
122
+ {"train_loss": 5.575918775191791, "train_entropy": 3.7480972049524075, "train_KL_div": 1.8278215679523946, "train_lr": 0.0013582217653145855, "train_wd": 0.16707226322503174, "epoch": 121}
123
+ {"train_loss": 5.631407411264287, "train_entropy": 3.7659894872150073, "train_KL_div": 1.8654179316368416, "train_lr": 0.001348124437146618, "train_wd": 0.1688767491946666, "epoch": 122}
124
+ {"train_loss": 5.609137050158305, "train_entropy": 3.8220101118230705, "train_KL_div": 1.7871269305094446, "train_lr": 0.0013379868418046143, "train_wd": 0.17068684140485862, "epoch": 123}
125
+ {"train_loss": 5.5796675442886965, "train_entropy": 3.7535208973476735, "train_KL_div": 1.8261466544690272, "train_lr": 0.0013278101689803352, "train_wd": 0.17250234135860018, "epoch": 124}
126
+ {"train_loss": 5.573730785450299, "train_entropy": 3.7498362553920104, "train_KL_div": 1.823894528235844, "train_lr": 0.0013175956129514628, "train_wd": 0.1743230499658613, "epoch": 125}
127
+ {"train_loss": 5.576761185265274, "train_entropy": 3.756297891302932, "train_KL_div": 1.8204632972737105, "train_lr": 0.0013073443724414284, "train_wd": 0.17614876756542594, "epoch": 126}
128
+ {"train_loss": 5.716454054668938, "train_entropy": 3.8510050665465094, "train_KL_div": 1.8654489765207258, "train_lr": 0.0012970576504787634, "train_wd": 0.17797929394678566, "epoch": 127}
129
+ {"train_loss": 5.657474957400565, "train_entropy": 3.9015341130234926, "train_KL_div": 1.7559408443651612, "train_lr": 0.0012867366542558961, "train_wd": 0.1798144283720939, "epoch": 128}
130
+ {"train_loss": 5.617955752032743, "train_entropy": 3.813660272615228, "train_KL_div": 1.8042954816449461, "train_lr": 0.0012763825949874954, "train_wd": 0.1816539695981824, "epoch": 129}
131
+ {"train_loss": 5.6396059048928615, "train_entropy": 3.8333629202023207, "train_KL_div": 1.8062429871442889, "train_lr": 0.0012659966877683238, "train_wd": 0.1834977158986244, "epoch": 130}
132
+ {"train_loss": 5.627552204900127, "train_entropy": 3.8250692975130396, "train_KL_div": 1.802482913652484, "train_lr": 0.0012555801514306441, "train_wd": 0.18534546508586172, "epoch": 131}
133
+ {"train_loss": 5.632187221928847, "train_entropy": 3.829860196315604, "train_KL_div": 1.8023270317237916, "train_lr": 0.0012451342084011758, "train_wd": 0.18719701453337284, "epoch": 132}
134
+ {"train_loss": 5.646723199805481, "train_entropy": 3.8471993638171282, "train_KL_div": 1.79952383859695, "train_lr": 0.0012346600845576493, "train_wd": 0.18905216119789686, "epoch": 133}
135
+ {"train_loss": 5.660236131563652, "train_entropy": 3.859651494059536, "train_KL_div": 1.800584631226808, "train_lr": 0.0012241590090849425, "train_wd": 0.19091070164169655, "epoch": 134}
136
+ {"train_loss": 5.665045032398306, "train_entropy": 3.8696297143431875, "train_KL_div": 1.7954153161373831, "train_lr": 0.001213632214330815, "train_wd": 0.19277243205486666, "epoch": 135}
137
+ {"train_loss": 5.673876413862578, "train_entropy": 3.8785615638434456, "train_KL_div": 1.7953148442421027, "train_lr": 0.0012030809356613137, "train_wd": 0.19463714827768913, "epoch": 136}
138
+ {"train_loss": 5.694471264438187, "train_entropy": 3.897827661771187, "train_KL_div": 1.796643596330135, "train_lr": 0.0011925064113157658, "train_wd": 0.19650464582301674, "epoch": 137}
139
+ {"train_loss": 5.728203303236469, "train_entropy": 3.9220843161467456, "train_KL_div": 1.806118986696648, "train_lr": 0.0011819098822614827, "train_wd": 0.19837471989869845, "epoch": 138}
140
+ {"train_loss": 5.700724693129865, "train_entropy": 3.9118387531414687, "train_KL_div": 1.7888859361886598, "train_lr": 0.0011712925920481395, "train_wd": 0.20024716543003776, "epoch": 139}
141
+ {"train_loss": 5.714419263968174, "train_entropy": 3.9212003172778016, "train_KL_div": 1.7932189491679438, "train_lr": 0.0011606557866618182, "train_wd": 0.20212177708228213, "epoch": 140}
+ {"train_loss": 5.714709861387166, "train_entropy": 3.9247941145841643, "train_KL_div": 1.7899157438847086, "train_lr": 0.0011500007143787896, "train_wd": 0.20399834928313787, "epoch": 141}
+ {"train_loss": 5.715840097478063, "train_entropy": 3.9258759188756858, "train_KL_div": 1.7899641739807541, "train_lr": 0.0011393286256190269, "train_wd": 0.20587667624531794, "epoch": 142}
+ {"train_loss": 5.719075407722681, "train_entropy": 3.926889518372637, "train_KL_div": 1.7921858686718533, "train_lr": 0.0011286407727994661, "train_wd": 0.20775655198910256, "epoch": 143}
+ {"train_loss": 5.723609644064991, "train_entropy": 3.9300865315371376, "train_KL_div": 1.7935231137428163, "train_lr": 0.0011179384101870204, "train_wd": 0.20963777036493306, "epoch": 144}
+ {"train_loss": 5.722265244483186, "train_entropy": 3.9335911371153323, "train_KL_div": 1.7886741096905763, "train_lr": 0.0011072227937513968, "train_wd": 0.21152012507601464, "epoch": 145}
+ {"train_loss": 5.736961868550662, "train_entropy": 3.9499529736171617, "train_KL_div": 1.7870088876318113, "train_lr": 0.0010964951810176948, "train_wd": 0.2134034097009406, "epoch": 146}
+ {"train_loss": 5.86180009635137, "train_entropy": 4.05431141939571, "train_KL_div": 1.8074886681174012, "train_lr": 0.0010857568309188327, "train_wd": 0.21528741771633014, "epoch": 147}
+ {"train_loss": 5.800990541299566, "train_entropy": 4.062831694964501, "train_KL_div": 1.7381588424400365, "train_lr": 0.0010750090036478002, "train_wd": 0.217171942519473, "epoch": 148}
+ {"train_loss": 5.782979243760296, "train_entropy": 4.006871146335304, "train_KL_div": 1.776108087145453, "train_lr": 0.0010642529605097886, "train_wd": 0.21905677745098834, "epoch": 149}
+ {"train_loss": 5.806451067507124, "train_entropy": 4.007266767996011, "train_KL_div": 1.7991842880047006, "train_lr": 0.0010534899637741457, "train_wd": 0.22094171581748476, "epoch": 150}
+ {"train_loss": 5.7994747336248125, "train_entropy": 4.036307452274836, "train_KL_div": 1.7631672752989853, "train_lr": 0.0010427212765262623, "train_wd": 0.22282655091422968, "epoch": 151}
+ {"train_loss": 5.798202089268527, "train_entropy": 4.022758474976992, "train_KL_div": 1.7754436078117335, "train_lr": 0.00103194816251933, "train_wd": 0.22471107604781423, "epoch": 152}
+ {"train_loss": 5.80420247451674, "train_entropy": 4.028265029477844, "train_KL_div": 1.7759374472186815, "train_lr": 0.001021171886026043, "train_wd": 0.22659508455882188, "epoch": 153}
+ {"train_loss": 5.8068096890247505, "train_entropy": 4.029490140702227, "train_KL_div": 1.7773195440344196, "train_lr": 0.0010103937116902315, "train_wd": 0.22847836984448705, "epoch": 154}
+ {"train_loss": 5.809986693395985, "train_entropy": 4.032551892631822, "train_KL_div": 1.7774347837904183, "train_lr": 0.0009996149043784357, "train_wd": 0.23036072538135743, "epoch": 155}
+ {"train_loss": 5.806160874646916, "train_entropy": 4.0297403921850385, "train_KL_div": 1.7764204737427327, "train_lr": 0.000988836729031487, "train_wd": 0.2322419447479335, "epoch": 156}
+ {"train_loss": 5.809986023451213, "train_entropy": 4.037986398672314, "train_KL_div": 1.7719996306393073, "train_lr": 0.000978060450516052, "train_wd": 0.23412182164731465, "epoch": 157}
+ {"train_loss": 5.820273596176998, "train_entropy": 4.049174525588155, "train_KL_div": 1.7710990617267615, "train_lr": 0.0009672873334761916, "train_wd": 0.2360001499298133, "epoch": 158}
+ {"train_loss": 5.829460150760998, "train_entropy": 4.059529521601568, "train_KL_div": 1.7699306199638296, "train_lr": 0.0009565186421849599, "train_wd": 0.23787672361556705, "epoch": 159}
+ {"train_loss": 5.920780453798201, "train_entropy": 4.129446741297758, "train_KL_div": 1.7913336956696353, "train_lr": 0.0009457556403960191, "train_wd": 0.23975133691712414, "epoch": 160}
+ {"train_loss": 5.853237970936879, "train_entropy": 4.097673060034486, "train_KL_div": 1.7555649197406524, "train_lr": 0.000934999591195357, "train_wd": 0.24162378426201056, "epoch": 161}
+ {"train_loss": 5.859795070523552, "train_entropy": 4.093456496485322, "train_KL_div": 1.7663385630559196, "train_lr": 0.0009242517568530358, "train_wd": 0.2434938603152749, "epoch": 162}
+ {"train_loss": 5.86798975650641, "train_entropy": 4.1001551269436725, "train_KL_div": 1.7678346246433296, "train_lr": 0.0009135133986750642, "train_wd": 0.24536136000200406, "epoch": 163}
+ {"train_loss": 5.874571724904241, "train_entropy": 4.107103958618727, "train_KL_div": 1.7674677681556041, "train_lr": 0.0009027857768553882, "train_wd": 0.24722607852981315, "epoch": 164}
+ {"train_loss": 5.877673099414527, "train_entropy": 4.113579340547108, "train_KL_div": 1.7640937509463368, "train_lr": 0.0008920701503279939, "train_wd": 0.2490878114112999, "epoch": 165}
+ {"train_loss": 5.88911056180271, "train_entropy": 4.1268034024918965, "train_KL_div": 1.762307150496377, "train_lr": 0.0008813677766191648, "train_wd": 0.2509463544864762, "epoch": 166}
+ {"train_loss": 5.908474380020901, "train_entropy": 4.1508879772574305, "train_KL_div": 1.7575863953069342, "train_lr": 0.0008706799116999062, "train_wd": 0.2528015039451491, "epoch": 167}
+ {"train_loss": 5.918202764839291, "train_entropy": 4.192322739880148, "train_KL_div": 1.725880017145265, "train_lr": 0.0008600078098385551, "train_wd": 0.254653056349276, "epoch": 168}
+ {"train_loss": 5.909525770268185, "train_entropy": 4.160937340449181, "train_KL_div": 1.7485884262574949, "train_lr": 0.0008493527234535819, "train_wd": 0.2565008086552697, "epoch": 169}
+ {"train_loss": 5.912968448978915, "train_entropy": 4.157037534230619, "train_KL_div": 1.755930897429121, "train_lr": 0.0008387159029666142, "train_wd": 0.2583445582362668, "epoch": 170}
+ {"train_loss": 5.910390038665631, "train_entropy": 4.156526599165728, "train_KL_div": 1.753863419333903, "train_lr": 0.000828098596655703, "train_wd": 0.2601841029043497, "epoch": 171}
+ {"train_loss": 5.9169490692807996, "train_entropy": 4.166448484674442, "train_KL_div": 1.7505005785553671, "train_lr": 0.0008175020505088136, "train_wd": 0.26201924093271484, "epoch": 172}
+ {"train_loss": 5.927290933595287, "train_entropy": 4.174429464254448, "train_KL_div": 1.752861465410077, "train_lr": 0.00080692750807762, "train_wd": 0.2638497710777995, "epoch": 173}
+ {"train_loss": 5.927755501725786, "train_entropy": 4.1769190441361435, "train_KL_div": 1.7508364490968147, "train_lr": 0.000796376210331563, "train_wd": 0.2656754926013423, "epoch": 174}
+ {"train_loss": 5.945073442255183, "train_entropy": 4.186669607123406, "train_KL_div": 1.7584038297597357, "train_lr": 0.0007858493955122119, "train_wd": 0.26749620529240564, "epoch": 175}
+ {"train_loss": 6.04566592683228, "train_entropy": 4.392409111765458, "train_KL_div": 1.6532567939121756, "train_lr": 0.0007753482989879629, "train_wd": 0.2693117094893247, "epoch": 176}
+ {"train_loss": 5.978560313808737, "train_entropy": 4.266186097971827, "train_KL_div": 1.712374197755405, "train_lr": 0.0007648741531090522, "train_wd": 0.27112180610160586, "epoch": 177}
+ {"train_loss": 5.953726047949253, "train_entropy": 4.2258376905576975, "train_KL_div": 1.7278883444200412, "train_lr": 0.0007544281870629359, "train_wd": 0.27292629663175794, "epoch": 178}
+ {"train_loss": 5.943421036838818, "train_entropy": 4.212485758473071, "train_KL_div": 1.7309352658349546, "train_lr": 0.0007440116267300488, "train_wd": 0.2747249831970586, "epoch": 179}
+ {"train_loss": 5.939117658147804, "train_entropy": 4.208252442040318, "train_KL_div": 1.730865202623782, "train_lr": 0.0007336256945399273, "train_wd": 0.2765176685512585, "epoch": 180}
+ {"train_loss": 5.938323804228712, "train_entropy": 4.208862644067104, "train_KL_div": 1.7294611459393963, "train_lr": 0.0007232716093277636, "train_wd": 0.27830415610620435, "epoch": 181}
+ {"train_loss": 5.942087706711462, "train_entropy": 4.211041556464301, "train_KL_div": 1.7310461325349091, "train_lr": 0.000712950586191366, "train_wd": 0.2800842499534056, "epoch": 182}
+ {"train_loss": 5.938182948590469, "train_entropy": 4.210236493489154, "train_KL_div": 1.7279464487168048, "train_lr": 0.0007026638363485578, "train_wd": 0.2818577548855112, "epoch": 183}
+ {"train_loss": 5.98858530487088, "train_entropy": 4.255549505269594, "train_KL_div": 1.7330357949201056, "train_lr": 0.0006924125669950476, "train_wd": 0.2836244764177193, "epoch": 184}
+ {"train_loss": 5.9674714513057525, "train_entropy": 4.286696881079655, "train_KL_div": 1.6807745603158128, "train_lr": 0.0006821979811627406, "train_wd": 0.2853842208091065, "epoch": 185}
+ {"train_loss": 5.957594391872748, "train_entropy": 4.24548126655898, "train_KL_div": 1.7121131077563638, "train_lr": 0.0006720212775785752, "train_wd": 0.2871367950838696, "epoch": 186}
+ {"train_loss": 5.953354545110326, "train_entropy": 4.240094133180013, "train_KL_div": 1.713260414014808, "train_lr": 0.0006618836505238369, "train_wd": 0.2888820070524873, "epoch": 187}
+ {"train_loss": 5.9536444725369, "train_entropy": 4.238713526873471, "train_KL_div": 1.7149309431024784, "train_lr": 0.000651786289694012, "train_wd": 0.29061966533280315, "epoch": 188}
+ {"train_loss": 5.949846254692947, "train_entropy": 4.23680592188351, "train_KL_div": 1.7130403159905394, "train_lr": 0.0006417303800591608, "train_wd": 0.2923495793710027, "epoch": 189}
+ {"train_loss": 5.946489517113192, "train_entropy": 4.23681537553275, "train_KL_div": 1.7096741155183002, "train_lr": 0.0006317171017248649, "train_wd": 0.2940715594625201, "epoch": 190}
+ {"train_loss": 5.945127653751633, "train_entropy": 4.236696942914113, "train_KL_div": 1.7084306893136194, "train_lr": 0.0006217476297937315, "train_wd": 0.29578541677283016, "epoch": 191}
+ {"train_loss": 5.945089780502944, "train_entropy": 4.238371372342014, "train_KL_div": 1.706718387875816, "train_lr": 0.000611823134227499, "train_wd": 0.2974909633581683, "epoch": 192}
+ {"train_loss": 5.943930092713625, "train_entropy": 4.240266465002017, "train_KL_div": 1.7036636062115311, "train_lr": 0.0006019447797097216, "train_wd": 0.2991880121861295, "epoch": 193}
+ {"train_loss": 5.94670016164307, "train_entropy": 4.242095719638774, "train_KL_div": 1.7046044218263847, "train_lr": 0.0005921137255091005, "train_wd": 0.3008763771561874, "epoch": 194}
+ {"train_loss": 5.945096118606442, "train_entropy": 4.243772328114338, "train_KL_div": 1.7013237745665628, "train_lr": 0.0005823311253434356, "train_wd": 0.3025558731200963, "epoch": 195}
+ {"train_loss": 5.947144784992166, "train_entropy": 4.245347125877103, "train_KL_div": 1.7017976445593328, "train_lr": 0.0005725981272442279, "train_wd": 0.30422631590219695, "epoch": 196}
+ {"train_loss": 5.945578572132605, "train_entropy": 4.246625127695161, "train_KL_div": 1.6989534340149683, "train_lr": 0.0005629158734219622, "train_wd": 0.3058875223196151, "epoch": 197}
+ {"train_loss": 5.940433853345333, "train_entropy": 4.246101958741196, "train_KL_div": 1.6943318832763952, "train_lr": 0.0005532855001320507, "train_wd": 0.3075393102023465, "epoch": 198}
+ {"train_loss": 5.943517345461628, "train_entropy": 4.249389858768998, "train_KL_div": 1.6941274723989501, "train_lr": 0.0005437081375414991, "train_wd": 0.3091814984132353, "epoch": 199}
+ {"train_loss": 5.940877528737584, "train_entropy": 4.249548657144383, "train_KL_div": 1.691328859800915, "train_lr": 0.0005341849095962706, "train_wd": 0.31081390686783883, "epoch": 200, "k-NN": {"10": {"top1": 69.6, "top5": 85.6}, "20": {"top1": 69.48, "top5": 87.388}, "100": {"top1": 67.394, "top5": 88.69}, "200": {"top1": 66.044, "top5": 88.468}}}
+ {"train_loss": 5.93931199973531, "train_entropy": 4.249362190969461, "train_KL_div": 1.6899497973428166, "train_lr": 0.0005247169338893862, "train_wd": 0.31243635655417534, "epoch": 201}
+ {"train_loss": 5.9366219949950985, "train_entropy": 4.2486697762942525, "train_KL_div": 1.6879522203684425, "train_lr": 0.0005153053215297802, "train_wd": 0.3140486695523516, "epoch": 202}
+ {"train_loss": 5.932137525005402, "train_entropy": 4.24702232151771, "train_KL_div": 1.6851151936369644, "train_lr": 0.0005059511770118849, "train_wd": 0.31565066905408046, "epoch": 203}
+ {"train_loss": 5.9281863743166845, "train_entropy": 4.245633288181657, "train_KL_div": 1.6825530821804424, "train_lr": 0.0004966555980860385, "train_wd": 0.31724217938206156, "epoch": 204}
+ {"train_loss": 5.924784926058863, "train_entropy": 4.245213072172267, "train_KL_div": 1.6795718484549975, "train_lr": 0.0004874196756296408, "train_wd": 0.31882302600925444, "epoch": 205}
+ {"train_loss": 5.920227467680244, "train_entropy": 4.240561024486113, "train_KL_div": 1.679666411890972, "train_lr": 0.00047824449351914115, "train_wd": 0.32039303557801124, "epoch": 206}
+ {"train_loss": 5.915632584398027, "train_entropy": 4.238494826068314, "train_KL_div": 1.6771377404626133, "train_lr": 0.0004691311285028425, "train_wd": 0.32195203591908944, "epoch": 207}
+ {"train_loss": 5.911727037218263, "train_entropy": 4.238794208644963, "train_KL_div": 1.6729328133862653, "train_lr": 0.0004600806500745353, "train_wd": 0.3234998560705342, "epoch": 208}
+ {"train_loss": 5.907188497668357, "train_entropy": 4.237878706505735, "train_KL_div": 1.6693097712824956, "train_lr": 0.0004510941203479906, "train_wd": 0.32503632629642326, "epoch": 209}
+ {"train_loss": 5.902216429428326, "train_entropy": 4.234456444482247, "train_KL_div": 1.6677599736064268, "train_lr": 0.0004421725939323164, "train_wd": 0.3265612781054832, "epoch": 210}
+ {"train_loss": 5.897876906976235, "train_entropy": 4.232939002706374, "train_KL_div": 1.6649378926681577, "train_lr": 0.00043331711780819393, "train_wd": 0.3280745442695616, "epoch": 211}
+ {"train_loss": 5.895790366865367, "train_entropy": 4.232371793090582, "train_KL_div": 1.6634185474386796, "train_lr": 0.0004245287312050084, "train_wd": 0.3295759588419709, "epoch": 212}
+ {"train_loss": 5.895929314392648, "train_entropy": 4.236182002641028, "train_KL_div": 1.659747297493674, "train_lr": 0.0004158084654788951, "train_wd": 0.33106535717568547, "epoch": 213}
+ {"train_loss": 5.895643451564508, "train_entropy": 4.255818578193513, "train_KL_div": 1.6398248686540804, "train_lr": 0.0004071573439916995, "train_wd": 0.3325425759413923, "epoch": 214}
+ {"train_loss": 5.885834379543027, "train_entropy": 4.239514388173795, "train_KL_div": 1.646319966641166, "train_lr": 0.0003985763819908853, "train_wd": 0.33400745314540786, "epoch": 215}
+ {"train_loss": 5.87702196593479, "train_entropy": 4.2314336108599155, "train_KL_div": 1.6455883389587502, "train_lr": 0.0003900665864903894, "train_wd": 0.33545982814743885, "epoch": 216}
+ {"train_loss": 5.871737890678058, "train_entropy": 4.227541248503921, "train_KL_div": 1.644196632204296, "train_lr": 0.000381628956152445, "train_wd": 0.3368995416781991, "epoch": 217}
+ {"train_loss": 5.863101538041417, "train_entropy": 4.223289186338917, "train_KL_div": 1.639812328784967, "train_lr": 0.0003732644811703823, "train_wd": 0.3383264358568765, "epoch": 218}
+ {"train_loss": 5.857191405326819, "train_entropy": 4.220809852357486, "train_KL_div": 1.6363815285509629, "train_lr": 0.00036497414315242595, "train_wd": 0.33974035420844456, "epoch": 219}
+ {"train_loss": 5.849436077949622, "train_entropy": 4.216850670621835, "train_KL_div": 1.6325853908900543, "train_lr": 0.00035675891500650106, "train_wd": 0.3411411416808224, "epoch": 220}
+ {"train_loss": 5.843311388429692, "train_entropy": 4.215326313563674, "train_KL_div": 1.6279850637050366, "train_lr": 0.0003486197608260549, "train_wd": 0.3425286446618775, "epoch": 221}
+ {"train_loss": 5.839030936896372, "train_entropy": 4.213439176313216, "train_KL_div": 1.6255917389520542, "train_lr": 0.0003405576357769166, "train_wd": 0.3439027109962745, "epoch": 222}
+ {"train_loss": 5.830942134062449, "train_entropy": 4.210554832653653, "train_KL_div": 1.620387274274628, "train_lr": 0.0003325734859852069, "train_wd": 0.34526319000215344, "epoch": 223}
+ {"train_loss": 5.825088720956295, "train_entropy": 4.208099402255006, "train_KL_div": 1.6169892962840344, "train_lr": 0.0003246682484263037, "train_wd": 0.34660993248766137, "epoch": 224}
+ {"train_loss": 5.818030275315118, "train_entropy": 4.204817335692336, "train_KL_div": 1.6132129035431513, "train_lr": 0.0003168428508148856, "train_wd": 0.34794279076730644, "epoch": 225}
+ {"train_loss": 5.808955838735536, "train_entropy": 4.201345911867422, "train_KL_div": 1.6076099016159464, "train_lr": 0.00030909821149605694, "train_wd": 0.34926161867816086, "epoch": 226}
+ {"train_loss": 5.8025659055923295, "train_entropy": 4.197199286459733, "train_KL_div": 1.6053665961435946, "train_lr": 0.00030143523933758004, "train_wd": 0.35056627159588083, "epoch": 227}
+ {"train_loss": 5.793698376269458, "train_entropy": 4.192670385828979, "train_KL_div": 1.601027967177516, "train_lr": 0.0002938548336232136, "train_wd": 0.351856606450572, "epoch": 228}
+ {"train_loss": 5.783426511678383, "train_entropy": 4.187715864581741, "train_KL_div": 1.5957106268591732, "train_lr": 0.0002863578839471787, "train_wd": 0.353132481742477, "epoch": 229}
+ {"train_loss": 5.776703362032283, "train_entropy": 4.183985500217532, "train_KL_div": 1.5927178395285213, "train_lr": 0.0002789452701097596, "train_wd": 0.3543937575574922, "epoch": 230}
+ {"train_loss": 5.768104237761143, "train_entropy": 4.180391667820186, "train_KL_div": 1.5877125481669185, "train_lr": 0.00027161786201405436, "train_wd": 0.3556402955825113, "epoch": 231}
+ {"train_loss": 5.759892443720576, "train_entropy": 4.176751979749551, "train_KL_div": 1.583140437265666, "train_lr": 0.0002643765195638927, "train_wd": 0.3568719591205926, "epoch": 232}
+ {"train_loss": 5.75135710351854, "train_entropy": 4.173077716029805, "train_KL_div": 1.5782793676086087, "train_lr": 0.0002572220925629175, "train_wd": 0.3580886131059502, "epoch": 233}
+ {"train_loss": 5.74134116340503, "train_entropy": 4.168121234618789, "train_KL_div": 1.5732199205197304, "train_lr": 0.00025015542061485795, "train_wd": 0.35929012411876265, "epoch": 234}
+ {"train_loss": 5.73250815193716, "train_entropy": 4.163487009221702, "train_KL_div": 1.5690211207270146, "train_lr": 0.00024317733302499914, "train_wd": 0.36047636039980857, "epoch": 235}
+ {"train_loss": 5.722429361227128, "train_entropy": 4.158209657902531, "train_KL_div": 1.5642196762618972, "train_lr": 0.00023628864870285911, "train_wd": 0.3616471918649123, "epoch": 236}
+ {"train_loss": 5.715903261106172, "train_entropy": 4.15506462684924, "train_KL_div": 1.5608386163898318, "train_lr": 0.0002294901760660861, "train_wd": 0.3628024901192085, "epoch": 237}
+ {"train_loss": 5.703981509263948, "train_entropy": 4.148914573766249, "train_KL_div": 1.5550669156175723, "train_lr": 0.00022278271294558666, "train_wd": 0.36394212847122615, "epoch": 238}
+ {"train_loss": 5.695237991001775, "train_entropy": 4.144880938110687, "train_KL_div": 1.5503570372156865, "train_lr": 0.0002161670464918976, "train_wd": 0.36506598194677603, "epoch": 239}
+ {"train_loss": 5.68346862822509, "train_entropy": 4.1392096897585695, "train_KL_div": 1.5442589234819801, "train_lr": 0.00020964395308280959, "train_wd": 0.3661739273026614, "epoch": 240}
+ {"train_loss": 5.6751918788912965, "train_entropy": 4.134816691505728, "train_KL_div": 1.5403751660403397, "train_lr": 0.00020321419823225622, "train_wd": 0.3672658430401871, "epoch": 241}
+ {"train_loss": 5.664431006740704, "train_entropy": 4.128975723715995, "train_KL_div": 1.53545526618199, "train_lr": 0.00019687853650047701, "train_wd": 0.3683416094184872, "epoch": 242}
+ {"train_loss": 5.652576451821865, "train_entropy": 4.124857353506615, "train_KL_div": 1.5277190912994358, "train_lr": 0.00019063771140546718, "train_wd": 0.3694011084676571, "epoch": 243}
+ {"train_loss": 5.643295366177075, "train_entropy": 4.119941368710985, "train_KL_div": 1.523353980825864, "train_lr": 0.00018449245533572195, "train_wd": 0.3704442240016874, "epoch": 244}
+ {"train_loss": 5.632327610640217, "train_entropy": 4.115572628166845, "train_KL_div": 1.5167549674530967, "train_lr": 0.00017844348946428778, "train_wd": 0.3714708416312026, "epoch": 245}
+ {"train_loss": 5.621810959635688, "train_entropy": 4.10966910368247, "train_KL_div": 1.5121418346556352, "train_lr": 0.00017249152366412976, "train_wd": 0.37248084877601123, "epoch": 246}
+ {"train_loss": 5.610179832132219, "train_entropy": 4.105134971755491, "train_KL_div": 1.5050448503949754, "train_lr": 0.00016663725642482362, "train_wd": 0.37347413467744817, "epoch": 247}
+ {"train_loss": 5.597649496950978, "train_entropy": 4.099963105720677, "train_KL_div": 1.4976863656922592, "train_lr": 0.0001608813747705871, "train_wd": 0.3744505904105231, "epoch": 248}
+ {"train_loss": 5.5896716277471645, "train_entropy": 4.095133028084711, "train_KL_div": 1.4945385914554985, "train_lr": 0.00015522455417965226, "train_wd": 0.37541010889585846, "epoch": 249}
+ {"train_loss": 5.578928518828919, "train_entropy": 4.090752164832503, "train_KL_div": 1.488176332889415, "train_lr": 0.00014966745850499615, "train_wd": 0.37635258491144113, "epoch": 250}
+ {"train_loss": 5.5653230558387, "train_entropy": 4.085000916803293, "train_KL_div": 1.4803221219663711, "train_lr": 0.00014421073989643563, "train_wd": 0.37727791510415504, "epoch": 251}
+ {"train_loss": 5.556099176788025, "train_entropy": 4.080912176415407, "train_KL_div": 1.4751869962393618, "train_lr": 0.00013885503872409274, "train_wd": 0.3781859980011157, "epoch": 252}
+ {"train_loss": 5.544552157155806, "train_entropy": 4.0762377154293485, "train_KL_div": 1.4683144386056706, "train_lr": 0.00013360098350324658, "train_wd": 0.37907673402080266, "epoch": 253}
+ {"train_loss": 5.533460170102062, "train_entropy": 4.0713386465129044, "train_KL_div": 1.462121501219549, "train_lr": 0.00012844919082057239, "train_wd": 0.37995002548397105, "epoch": 254}
+ {"train_loss": 5.523174897467585, "train_entropy": 4.067315061124776, "train_KL_div": 1.4558598209771034, "train_lr": 0.0001234002652617834, "train_wd": 0.3808057766243729, "epoch": 255}
+ {"train_loss": 5.510590607837902, "train_entropy": 4.061777040612498, "train_KL_div": 1.4488135580059818, "train_lr": 0.0001184547993406797, "train_wd": 0.3816438935992515, "epoch": 256}
+ {"train_loss": 5.4989435093865975, "train_entropy": 4.057814534071634, "train_KL_div": 1.4411289776138645, "train_lr": 0.0001136133734296139, "train_wd": 0.3824642844996354, "epoch": 257}
+ {"train_loss": 5.487465196137996, "train_entropy": 4.05207173908643, "train_KL_div": 1.4353934640316464, "train_lr": 0.00010887655569138309, "train_wd": 0.38326685936041804, "epoch": 258}
+ {"train_loss": 5.476617317262599, "train_entropy": 4.048170925639897, "train_KL_div": 1.4284463745655773, "train_lr": 0.00010424490201255032, "train_wd": 0.38405153017022187, "epoch": 259}
+ {"train_loss": 5.464720789667704, "train_entropy": 4.043207954660022, "train_KL_div": 1.4215128270627784, "train_lr": 9.971895593821063e-05, "train_wd": 0.38481821088104945, "epoch": 260}
+ {"train_loss": 5.452156265862554, "train_entropy": 4.037763354065512, "train_KL_div": 1.414392895156817, "train_lr": 9.529924860820295e-05, "train_wd": 0.3855668174177217, "epoch": 261}
+ {"train_loss": 5.441671876932125, "train_entropy": 4.034716015930275, "train_KL_div": 1.406955852556667, "train_lr": 9.098629869477826e-05, "train_wd": 0.3862972676870952, "epoch": 262}
+ {"train_loss": 5.429566863867686, "train_entropy": 4.029198635586923, "train_KL_div": 1.4003682235519377, "train_lr": 8.678061234173184e-05, "train_wd": 0.3870094815870675, "epoch": 263}
+ {"train_loss": 5.417762428164768, "train_entropy": 4.025429022207344, "train_KL_div": 1.3923334057549301, "train_lr": 8.268268310500414e-05, "train_wd": 0.3877033810153572, "epoch": 264}
+ {"train_loss": 5.4068270549595026, "train_entropy": 4.021668031037473, "train_KL_div": 1.3851590297347922, "train_lr": 7.869299189476021e-05, "train_wd": 0.38837888987807484, "epoch": 265}
+ {"train_loss": 5.3961444920677835, "train_entropy": 4.018166207652584, "train_KL_div": 1.3779782815683754, "train_lr": 7.481200691895316e-05, "train_wd": 0.38903593409805837, "epoch": 266}
+ {"train_loss": 5.3831252784466, "train_entropy": 4.013197893147274, "train_KL_div": 1.3699273849896294, "train_lr": 7.104018362837708e-05, "train_wd": 0.3896744416230066, "epoch": 267}
+ {"train_loss": 5.371951584383357, "train_entropy": 4.010668040036584, "train_KL_div": 1.3612835452043943, "train_lr": 6.73779646632189e-05, "train_wd": 0.39029434243337324, "epoch": 268}
+ {"train_loss": 5.360731474072527, "train_entropy": 4.00675073773455, "train_KL_div": 1.353980731859291, "train_lr": 6.382577980111137e-05, "train_wd": 0.3908955685500501, "epoch": 269}
+ {"train_loss": 5.351332967992214, "train_entropy": 4.003147974955759, "train_KL_div": 1.348184989915668, "train_lr": 6.038404590669771e-05, "train_wd": 0.39147805404181696, "epoch": 270}
+ {"train_loss": 5.339481295632134, "train_entropy": 3.9998129003768343, "train_KL_div": 1.3396684021281777, "train_lr": 5.7053166882710366e-05, "train_wd": 0.3920417350325767, "epoch": 271}
+ {"train_loss": 5.328205978031829, "train_entropy": 3.9972404288492807, "train_KL_div": 1.3309655657155717, "train_lr": 5.3833533622571014e-05, "train_wd": 0.39258654970835427, "epoch": 272}
+ {"train_loss": 5.31784350480393, "train_entropy": 3.9940507542029273, "train_KL_div": 1.3237927454791005, "train_lr": 5.072552396451795e-05, "train_wd": 0.39311243832408505, "epoch": 273}
+ {"train_loss": 5.306939436329736, "train_entropy": 3.9913786446971953, "train_KL_div": 1.3155608137817882, "train_lr": 4.772950264726465e-05, "train_wd": 0.3936193432101521, "epoch": 274}
+ {"train_loss": 5.296510416636173, "train_entropy": 3.988137306259881, "train_KL_div": 1.3083731222400468, "train_lr": 4.4845821267196646e-05, "train_wd": 0.3941072087787241, "epoch": 275}
+ {"train_loss": 5.286677013150603, "train_entropy": 3.985823564797187, "train_KL_div": 1.3008534537135459, "train_lr": 4.20748182371092e-05, "train_wd": 0.39457598152984424, "epoch": 276}
+ {"train_loss": 5.277100117324735, "train_entropy": 3.9838448463441085, "train_KL_div": 1.293255279122068, "train_lr": 3.9416818746494013e-05, "train_wd": 0.39502561005729664, "epoch": 277}
+ {"train_loss": 5.268198249627837, "train_entropy": 3.982085013156124, "train_KL_div": 1.2861132515991907, "train_lr": 3.687213472337624e-05, "train_wd": 0.39545604505424736, "epoch": 278}
+ {"train_loss": 5.258950013908551, "train_entropy": 3.9794788965933043, "train_KL_div": 1.2794711347118461, "train_lr": 3.444106479770858e-05, "train_wd": 0.3958672393186477, "epoch": 279}
+ {"train_loss": 5.248260313420178, "train_entropy": 3.978048147390977, "train_KL_div": 1.2702121811209441, "train_lr": 3.2123894266325715e-05, "train_wd": 0.39625914775841237, "epoch": 280}
+ {"train_loss": 5.241209213062823, "train_entropy": 3.976339482694126, "train_KL_div": 1.2648697338408703, "train_lr": 2.9920895059463095e-05, "train_wd": 0.3966317273963664, "epoch": 281}
+ {"train_loss": 5.232404264138281, "train_entropy": 3.9745858036261574, "train_KL_div": 1.2578184683498241, "train_lr": 2.7832325708845335e-05, "train_wd": 0.3969849373749528, "epoch": 282}
+ {"train_loss": 5.224994003343925, "train_entropy": 3.974102639823223, "train_KL_div": 1.2508913862000075, "train_lr": 2.5858431317346136e-05, "train_wd": 0.3973187389607159, "epoch": 283}
+ {"train_loss": 5.2175875487182735, "train_entropy": 3.972895688004345, "train_KL_div": 1.2446918714460995, "train_lr": 2.3999443530223845e-05, "train_wd": 0.39763309554855275, "epoch": 284}
+ {"train_loss": 5.209197056665123, "train_entropy": 3.9718813677962355, "train_KL_div": 1.2373157029064725, "train_lr": 2.2255580507937882e-05, "train_wd": 0.39792797266571783, "epoch": 285}
+ {"train_loss": 5.203692179360836, "train_entropy": 3.971021909245865, "train_KL_div": 1.2326702809602998, "train_lr": 2.0627046900545606e-05, "train_wd": 0.3982033379756162, "epoch": 286}
+ {"train_loss": 5.195391235925215, "train_entropy": 3.9700617437644734, "train_KL_div": 1.2253294955018899, "train_lr": 1.911403382368657e-05, "train_wd": 0.3984591612813356, "epoch": 287}
+ {"train_loss": 5.189558804130478, "train_entropy": 3.9693524303624956, "train_KL_div": 1.2202063918197088, "train_lr": 1.771671883615352e-05, "train_wd": 0.39869541452896834, "epoch": 288}
+ {"train_loss": 5.183042350956957, "train_entropy": 3.969050787430968, "train_KL_div": 1.213991584007641, "train_lr": 1.6435265919055812e-05, "train_wd": 0.39891207181068383, "epoch": 289}
+ {"train_loss": 5.179306845680225, "train_entropy": 3.9683358702394695, "train_KL_div": 1.2109709927122847, "train_lr": 1.5269825456574793e-05, "train_wd": 0.39910910936756855, "epoch": 290}
+ {"train_loss": 5.174180611670256, "train_entropy": 3.968411438804355, "train_KL_div": 1.2057691861649784, "train_lr": 1.4220534218316071e-05, "train_wd": 0.39928650559223694, "epoch": 291}
+ {"train_loss": 5.1687768542414, "train_entropy": 3.967860822983497, "train_KL_div": 1.200916047796881, "train_lr": 1.3287515343258813e-05, "train_wd": 0.3994442410311896, "epoch": 292}
+ {"train_loss": 5.165959925483838, "train_entropy": 3.967836403613277, "train_KL_div": 1.1981235365215823, "train_lr": 1.2470878325304763e-05, "train_wd": 0.399582298386962, "epoch": 293}
+ {"train_loss": 5.16111125977491, "train_entropy": 3.967063317053038, "train_KL_div": 1.1940479593442308, "train_lr": 1.1770719000428801e-05, "train_wd": 0.3997006625200075, "epoch": 294}
+ {"train_loss": 5.158519479391767, "train_entropy": 3.9674203815839464, "train_KL_div": 1.1910991035788943, "train_lr": 1.1187119535431991e-05, "train_wd": 0.399799320450365, "epoch": 295}
+ {"train_loss": 5.155430122435712, "train_entropy": 3.9672769677343607, "train_KL_div": 1.1881531689712088, "train_lr": 1.0720148418299136e-05, "train_wd": 0.39987826135908094, "epoch": 296}
+ {"train_loss": 5.152276425052889, "train_entropy": 3.966756057134159, "train_KL_div": 1.1855203866315402, "train_lr": 1.036986045016131e-05, "train_wd": 0.3999374765893954, "epoch": 297}
+ {"train_loss": 5.150887112823321, "train_entropy": 3.966556099250162, "train_KL_div": 1.1843310370147944, "train_lr": 1.0136296738864623e-05, "train_wd": 0.3999769596476907, "epoch": 298}
+ {"train_loss": 5.149998925572676, "train_entropy": 3.9675655308530198, "train_KL_div": 1.1824334075601457, "train_lr": 1.0019484694146142e-05, "train_wd": 0.39999670620420386, "epoch": 299, "k-NN": {"10": {"top1": 72.56, "top5": 87.638}, "20": {"top1": 72.434, "top5": 89.24}, "100": {"top1": 70.526, "top5": 90.582}, "200": {"top1": 69.33, "top5": 90.424}}}