flickr_WSDQH_nbits=24_adaMargin_gamma=1_lambda=0.0001_0003.log
861 lines (861 loc) · 101 KB
2022-10-20 20:37:34,241 prepare dataset.
2022-10-20 20:37:49,125 prepare data loader.
2022-10-20 20:37:49,125 Initializing DataLoader.
2022-10-20 20:37:49,138 DataLoader already.
2022-10-20 20:37:49,139 prepare model.
2022-10-20 20:37:49,366 Number of semantic embeddings: 1178.
2022-10-20 20:37:57,216 From /data/wangjinpeng/anaconda3/envs/py37torch/lib/python3.7/site-packages/tensorflow_core/python/ops/math_grad.py:1424: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.where in 2.0, which has the same broadcast rule as np.where.
2022-10-20 20:38:10,229 begin training.
2022-10-20 20:38:30,418 step [ 1], lr [0.0003000], embedding loss [ 0.8912], quantization loss [ 0.0000], 17.51 sec/batch.
2022-10-20 20:38:33,603 step [ 2], lr [0.0003000], embedding loss [ 0.8806], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:38:36,905 step [ 3], lr [0.0003000], embedding loss [ 0.8660], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:38:40,214 step [ 4], lr [0.0003000], embedding loss [ 0.8579], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:38:43,546 step [ 5], lr [0.0003000], embedding loss [ 0.8475], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:38:46,816 step [ 6], lr [0.0003000], embedding loss [ 0.8496], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:38:50,066 step [ 7], lr [0.0003000], embedding loss [ 0.8464], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:38:53,373 step [ 8], lr [0.0003000], embedding loss [ 0.8440], quantization loss [ 0.0000], 0.59 sec/batch.
2022-10-20 20:38:56,706 step [ 9], lr [0.0003000], embedding loss [ 0.8473], quantization loss [ 0.0000], 0.60 sec/batch.
2022-10-20 20:39:00,052 step [ 10], lr [0.0003000], embedding loss [ 0.8478], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:39:03,417 step [ 11], lr [0.0003000], embedding loss [ 0.8501], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:06,758 step [ 12], lr [0.0003000], embedding loss [ 0.8442], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:10,057 step [ 13], lr [0.0003000], embedding loss [ 0.8479], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:39:13,362 step [ 14], lr [0.0003000], embedding loss [ 0.8395], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:16,713 step [ 15], lr [0.0003000], embedding loss [ 0.8403], quantization loss [ 0.0000], 0.59 sec/batch.
2022-10-20 20:39:20,046 step [ 16], lr [0.0003000], embedding loss [ 0.8370], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:23,377 step [ 17], lr [0.0003000], embedding loss [ 0.8349], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:39:26,648 step [ 18], lr [0.0003000], embedding loss [ 0.8346], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:39:29,996 step [ 19], lr [0.0003000], embedding loss [ 0.8295], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:33,251 step [ 20], lr [0.0003000], embedding loss [ 0.8314], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:39:36,384 step [ 21], lr [0.0003000], embedding loss [ 0.8386], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:39:39,699 step [ 22], lr [0.0003000], embedding loss [ 0.8428], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:39:42,964 step [ 23], lr [0.0003000], embedding loss [ 0.8252], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:39:46,304 step [ 24], lr [0.0003000], embedding loss [ 0.8294], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:49,598 step [ 25], lr [0.0003000], embedding loss [ 0.8282], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:53,001 step [ 26], lr [0.0003000], embedding loss [ 0.8459], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:39:56,329 step [ 27], lr [0.0003000], embedding loss [ 0.8397], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:39:59,636 step [ 28], lr [0.0003000], embedding loss [ 0.8370], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:02,929 step [ 29], lr [0.0003000], embedding loss [ 0.8305], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:06,240 step [ 30], lr [0.0003000], embedding loss [ 0.8330], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:09,532 step [ 31], lr [0.0003000], embedding loss [ 0.8351], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:12,885 step [ 32], lr [0.0003000], embedding loss [ 0.8309], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:40:16,144 step [ 33], lr [0.0003000], embedding loss [ 0.8369], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:19,457 step [ 34], lr [0.0003000], embedding loss [ 0.8294], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:40:22,717 step [ 35], lr [0.0003000], embedding loss [ 0.8383], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:40:26,039 step [ 36], lr [0.0003000], embedding loss [ 0.8317], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:40:29,239 step [ 37], lr [0.0003000], embedding loss [ 0.8312], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:32,521 step [ 38], lr [0.0003000], embedding loss [ 0.8345], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:35,820 step [ 39], lr [0.0003000], embedding loss [ 0.8327], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:38,985 step [ 40], lr [0.0003000], embedding loss [ 0.8301], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:40:42,280 step [ 41], lr [0.0003000], embedding loss [ 0.8383], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:45,598 step [ 42], lr [0.0003000], embedding loss [ 0.8257], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:40:48,928 step [ 43], lr [0.0003000], embedding loss [ 0.8244], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:52,214 step [ 44], lr [0.0003000], embedding loss [ 0.8357], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:55,511 step [ 45], lr [0.0003000], embedding loss [ 0.8363], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:40:58,866 step [ 46], lr [0.0003000], embedding loss [ 0.8347], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:41:02,149 step [ 47], lr [0.0003000], embedding loss [ 0.8275], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:05,463 step [ 48], lr [0.0003000], embedding loss [ 0.8361], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:41:08,779 step [ 49], lr [0.0003000], embedding loss [ 0.8367], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:41:12,007 step [ 50], lr [0.0003000], embedding loss [ 0.8258], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:41:15,300 step [ 51], lr [0.0003000], embedding loss [ 0.8355], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:18,614 step [ 52], lr [0.0003000], embedding loss [ 0.8308], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:41:21,952 step [ 53], lr [0.0003000], embedding loss [ 0.8297], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:41:25,279 step [ 54], lr [0.0003000], embedding loss [ 0.8376], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:28,532 step [ 55], lr [0.0003000], embedding loss [ 0.8324], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:31,833 step [ 56], lr [0.0003000], embedding loss [ 0.8249], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:35,164 step [ 57], lr [0.0003000], embedding loss [ 0.8197], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:38,460 step [ 58], lr [0.0003000], embedding loss [ 0.8283], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:41,764 step [ 59], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:45,065 step [ 60], lr [0.0003000], embedding loss [ 0.8320], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:41:48,295 step [ 61], lr [0.0003000], embedding loss [ 0.8322], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:51,586 step [ 62], lr [0.0003000], embedding loss [ 0.8336], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:55,153 step [ 63], lr [0.0003000], embedding loss [ 0.8357], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:41:58,506 step [ 64], lr [0.0003000], embedding loss [ 0.8261], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:01,817 step [ 65], lr [0.0003000], embedding loss [ 0.8303], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:05,180 step [ 66], lr [0.0003000], embedding loss [ 0.8395], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:08,512 step [ 67], lr [0.0003000], embedding loss [ 0.8381], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:11,846 step [ 68], lr [0.0003000], embedding loss [ 0.8293], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:15,197 step [ 69], lr [0.0003000], embedding loss [ 0.8328], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:18,600 step [ 70], lr [0.0003000], embedding loss [ 0.8371], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:22,005 step [ 71], lr [0.0003000], embedding loss [ 0.8321], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:25,357 step [ 72], lr [0.0003000], embedding loss [ 0.8279], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:27,230 step [ 73], lr [0.0003000], embedding loss [ 0.8376], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:29,126 step [ 74], lr [0.0003000], embedding loss [ 0.8318], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:31,057 step [ 75], lr [0.0003000], embedding loss [ 0.8279], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:32,994 step [ 76], lr [0.0003000], embedding loss [ 0.8347], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:34,854 step [ 77], lr [0.0003000], embedding loss [ 0.8231], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:36,773 step [ 78], lr [0.0003000], embedding loss [ 0.8267], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:38,665 step [ 79], lr [0.0003000], embedding loss [ 0.8312], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:40,579 step [ 80], lr [0.0003000], embedding loss [ 0.8254], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:42,475 step [ 81], lr [0.0003000], embedding loss [ 0.8205], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:44,382 step [ 82], lr [0.0003000], embedding loss [ 0.8277], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:46,248 step [ 83], lr [0.0003000], embedding loss [ 0.8252], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:42:48,124 step [ 84], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:50,002 step [ 85], lr [0.0003000], embedding loss [ 0.8318], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:42:51,845 step [ 86], lr [0.0003000], embedding loss [ 0.8303], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:53,672 step [ 87], lr [0.0003000], embedding loss [ 0.8290], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:55,525 step [ 88], lr [0.0003000], embedding loss [ 0.8249], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:42:57,378 step [ 89], lr [0.0003000], embedding loss [ 0.8255], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:42:59,207 step [ 90], lr [0.0003000], embedding loss [ 0.8268], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:01,066 step [ 91], lr [0.0003000], embedding loss [ 0.8267], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:02,947 step [ 92], lr [0.0003000], embedding loss [ 0.8210], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:04,832 step [ 93], lr [0.0003000], embedding loss [ 0.8254], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:06,739 step [ 94], lr [0.0003000], embedding loss [ 0.8223], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:08,717 step [ 95], lr [0.0003000], embedding loss [ 0.8321], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:10,615 step [ 96], lr [0.0003000], embedding loss [ 0.8330], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:12,566 step [ 97], lr [0.0003000], embedding loss [ 0.8298], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:14,483 step [ 98], lr [0.0003000], embedding loss [ 0.8226], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:16,338 step [ 99], lr [0.0003000], embedding loss [ 0.8275], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:18,186 step [ 100], lr [0.0003000], embedding loss [ 0.8300], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:20,091 step [ 101], lr [0.0003000], embedding loss [ 0.8250], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:21,971 step [ 102], lr [0.0003000], embedding loss [ 0.8302], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:23,842 step [ 103], lr [0.0003000], embedding loss [ 0.8303], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:25,585 step [ 104], lr [0.0003000], embedding loss [ 0.8238], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:43:27,499 step [ 105], lr [0.0003000], embedding loss [ 0.8257], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:29,383 step [ 106], lr [0.0003000], embedding loss [ 0.8187], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:31,246 step [ 107], lr [0.0003000], embedding loss [ 0.8251], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:33,106 step [ 108], lr [0.0003000], embedding loss [ 0.8179], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:34,888 step [ 109], lr [0.0003000], embedding loss [ 0.8278], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:43:36,748 step [ 110], lr [0.0003000], embedding loss [ 0.8300], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:38,694 step [ 111], lr [0.0003000], embedding loss [ 0.8144], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:40,594 step [ 112], lr [0.0003000], embedding loss [ 0.8355], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:42,482 step [ 113], lr [0.0003000], embedding loss [ 0.8264], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:44,342 step [ 114], lr [0.0003000], embedding loss [ 0.8305], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:43:46,200 step [ 115], lr [0.0003000], embedding loss [ 0.8295], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:43:48,082 step [ 116], lr [0.0003000], embedding loss [ 0.8285], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:43:49,985 step [ 117], lr [0.0003000], embedding loss [ 0.8264], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:51,822 step [ 118], lr [0.0003000], embedding loss [ 0.8353], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:53,726 step [ 119], lr [0.0003000], embedding loss [ 0.8349], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:43:55,642 step [ 120], lr [0.0003000], embedding loss [ 0.8232], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:43:57,608 step [ 121], lr [0.0003000], embedding loss [ 0.8229], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:43:59,505 step [ 122], lr [0.0003000], embedding loss [ 0.8286], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:01,481 step [ 123], lr [0.0003000], embedding loss [ 0.8234], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:03,382 step [ 124], lr [0.0003000], embedding loss [ 0.8278], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:05,317 step [ 125], lr [0.0003000], embedding loss [ 0.8169], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:07,196 step [ 126], lr [0.0003000], embedding loss [ 0.8298], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:09,168 step [ 127], lr [0.0003000], embedding loss [ 0.8341], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:11,076 step [ 128], lr [0.0003000], embedding loss [ 0.8225], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:13,031 step [ 129], lr [0.0003000], embedding loss [ 0.8242], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:14,942 step [ 130], lr [0.0003000], embedding loss [ 0.8173], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:16,899 step [ 131], lr [0.0003000], embedding loss [ 0.8224], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:18,847 step [ 132], lr [0.0003000], embedding loss [ 0.8197], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:20,788 step [ 133], lr [0.0003000], embedding loss [ 0.8299], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:22,624 step [ 134], lr [0.0003000], embedding loss [ 0.8210], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:24,601 step [ 135], lr [0.0003000], embedding loss [ 0.8226], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:26,519 step [ 136], lr [0.0003000], embedding loss [ 0.8181], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:28,434 step [ 137], lr [0.0003000], embedding loss [ 0.8260], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:30,391 step [ 138], lr [0.0003000], embedding loss [ 0.8207], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:32,309 step [ 139], lr [0.0003000], embedding loss [ 0.8324], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:34,190 step [ 140], lr [0.0003000], embedding loss [ 0.8241], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:36,149 step [ 141], lr [0.0003000], embedding loss [ 0.8230], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:44:37,990 step [ 142], lr [0.0003000], embedding loss [ 0.8272], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:44:39,946 step [ 143], lr [0.0003000], embedding loss [ 0.8262], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:45:01,884 step [ 144], lr [0.0003000], embedding loss [ 0.8177], quantization loss [ 0.0000], 1.66 sec/batch.
2022-10-20 20:45:05,196 step [ 145], lr [0.0003000], embedding loss [ 0.8261], quantization loss [ 0.0000], 2.18 sec/batch.
2022-10-20 20:45:07,857 step [ 146], lr [0.0003000], embedding loss [ 0.8248], quantization loss [ 0.0000], 1.57 sec/batch.
2022-10-20 20:45:10,642 step [ 147], lr [0.0003000], embedding loss [ 0.8206], quantization loss [ 0.0000], 1.65 sec/batch.
2022-10-20 20:45:10,643 initialize centers iter(1/1).
2022-10-20 20:45:17,422 finish center initialization, duration: 6.78 sec.
2022-10-20 20:45:17,422 update codes and centers iter(1/1).
2022-10-20 20:45:23,856 number of update_code wrong: 71.
2022-10-20 20:45:27,093 non zero codewords: 768.
2022-10-20 20:45:27,094 finish center update, duration: 9.67 sec.
2022-10-20 20:45:29,681 step [ 148], lr [0.0003000], embedding loss [ 0.8206], quantization loss [ 0.1407], 1.47 sec/batch.
2022-10-20 20:45:32,964 step [ 149], lr [0.0003000], embedding loss [ 0.8237], quantization loss [ 0.2303], 2.16 sec/batch.
2022-10-20 20:45:35,743 step [ 150], lr [0.0003000], embedding loss [ 0.8168], quantization loss [ 0.1344], 1.67 sec/batch.
2022-10-20 20:45:38,505 step [ 151], lr [0.0003000], embedding loss [ 0.8218], quantization loss [ 0.1454], 1.62 sec/batch.
2022-10-20 20:45:41,808 step [ 152], lr [0.0003000], embedding loss [ 0.8205], quantization loss [ 0.1229], 2.16 sec/batch.
2022-10-20 20:45:44,538 step [ 153], lr [0.0003000], embedding loss [ 0.8352], quantization loss [ 0.1249], 1.62 sec/batch.
2022-10-20 20:45:47,305 step [ 154], lr [0.0003000], embedding loss [ 0.8273], quantization loss [ 0.1047], 1.60 sec/batch.
2022-10-20 20:45:50,100 step [ 155], lr [0.0003000], embedding loss [ 0.8276], quantization loss [ 0.1084], 1.66 sec/batch.
2022-10-20 20:45:53,332 step [ 156], lr [0.0003000], embedding loss [ 0.8328], quantization loss [ 0.1095], 2.13 sec/batch.
2022-10-20 20:45:56,035 step [ 157], lr [0.0003000], embedding loss [ 0.8276], quantization loss [ 0.1063], 1.58 sec/batch.
2022-10-20 20:45:58,776 step [ 158], lr [0.0003000], embedding loss [ 0.8344], quantization loss [ 0.1159], 1.60 sec/batch.
2022-10-20 20:46:02,086 step [ 159], lr [0.0003000], embedding loss [ 0.8271], quantization loss [ 0.1036], 2.16 sec/batch.
2022-10-20 20:46:04,909 step [ 160], lr [0.0003000], embedding loss [ 0.8264], quantization loss [ 0.1078], 1.68 sec/batch.
2022-10-20 20:46:07,689 step [ 161], lr [0.0003000], embedding loss [ 0.8184], quantization loss [ 0.0861], 1.64 sec/batch.
2022-10-20 20:46:10,392 step [ 162], lr [0.0003000], embedding loss [ 0.8305], quantization loss [ 0.0877], 1.61 sec/batch.
2022-10-20 20:46:13,721 step [ 163], lr [0.0003000], embedding loss [ 0.8282], quantization loss [ 0.0989], 2.18 sec/batch.
2022-10-20 20:46:16,578 step [ 164], lr [0.0003000], embedding loss [ 0.8185], quantization loss [ 0.1113], 1.68 sec/batch.
2022-10-20 20:46:19,250 step [ 165], lr [0.0003000], embedding loss [ 0.8184], quantization loss [ 0.1061], 1.54 sec/batch.
2022-10-20 20:46:21,967 step [ 166], lr [0.0003000], embedding loss [ 0.8225], quantization loss [ 0.0799], 1.62 sec/batch.
2022-10-20 20:46:25,182 step [ 167], lr [0.0003000], embedding loss [ 0.8271], quantization loss [ 0.0833], 2.11 sec/batch.
2022-10-20 20:46:27,976 step [ 168], lr [0.0003000], embedding loss [ 0.8189], quantization loss [ 0.0979], 1.63 sec/batch.
2022-10-20 20:46:30,758 step [ 169], lr [0.0003000], embedding loss [ 0.8146], quantization loss [ 0.0859], 1.63 sec/batch.
2022-10-20 20:46:34,042 step [ 170], lr [0.0003000], embedding loss [ 0.8207], quantization loss [ 0.1043], 2.12 sec/batch.
2022-10-20 20:46:36,903 step [ 171], lr [0.0003000], embedding loss [ 0.8254], quantization loss [ 0.0930], 1.69 sec/batch.
2022-10-20 20:46:39,678 step [ 172], lr [0.0003000], embedding loss [ 0.8259], quantization loss [ 0.0885], 1.60 sec/batch.
2022-10-20 20:46:42,458 step [ 173], lr [0.0003000], embedding loss [ 0.8191], quantization loss [ 0.0853], 1.64 sec/batch.
2022-10-20 20:46:45,626 step [ 174], lr [0.0003000], embedding loss [ 0.8177], quantization loss [ 0.0814], 2.02 sec/batch.
2022-10-20 20:46:48,474 step [ 175], lr [0.0003000], embedding loss [ 0.8198], quantization loss [ 0.0903], 1.71 sec/batch.
2022-10-20 20:46:51,210 step [ 176], lr [0.0003000], embedding loss [ 0.8232], quantization loss [ 0.0765], 1.59 sec/batch.
2022-10-20 20:46:54,388 step [ 177], lr [0.0003000], embedding loss [ 0.8263], quantization loss [ 0.0867], 2.06 sec/batch.
2022-10-20 20:46:57,200 step [ 178], lr [0.0003000], embedding loss [ 0.8214], quantization loss [ 0.0816], 1.66 sec/batch.
2022-10-20 20:46:59,937 step [ 179], lr [0.0003000], embedding loss [ 0.8256], quantization loss [ 0.0755], 1.61 sec/batch.
2022-10-20 20:47:02,696 step [ 180], lr [0.0003000], embedding loss [ 0.8257], quantization loss [ 0.0800], 1.64 sec/batch.
2022-10-20 20:47:06,008 step [ 181], lr [0.0003000], embedding loss [ 0.8281], quantization loss [ 0.0795], 2.17 sec/batch.
2022-10-20 20:47:08,839 step [ 182], lr [0.0003000], embedding loss [ 0.8206], quantization loss [ 0.0718], 1.69 sec/batch.
2022-10-20 20:47:13,420 step [ 183], lr [0.0003000], embedding loss [ 0.8314], quantization loss [ 0.0837], 2.46 sec/batch.
2022-10-20 20:47:17,356 step [ 184], lr [0.0003000], embedding loss [ 0.8224], quantization loss [ 0.0820], 2.75 sec/batch.
2022-10-20 20:47:20,729 step [ 185], lr [0.0003000], embedding loss [ 0.8292], quantization loss [ 0.0759], 2.22 sec/batch.
2022-10-20 20:47:24,062 step [ 186], lr [0.0003000], embedding loss [ 0.8267], quantization loss [ 0.0767], 2.18 sec/batch.
2022-10-20 20:47:27,002 step [ 187], lr [0.0003000], embedding loss [ 0.8291], quantization loss [ 0.0809], 1.80 sec/batch.
2022-10-20 20:47:28,777 step [ 188], lr [0.0003000], embedding loss [ 0.8158], quantization loss [ 0.0846], 0.55 sec/batch.
2022-10-20 20:47:30,567 step [ 189], lr [0.0003000], embedding loss [ 0.8161], quantization loss [ 0.0907], 0.55 sec/batch.
2022-10-20 20:47:32,367 step [ 190], lr [0.0003000], embedding loss [ 0.8230], quantization loss [ 0.0911], 0.57 sec/batch.
2022-10-20 20:47:34,130 step [ 191], lr [0.0003000], embedding loss [ 0.8271], quantization loss [ 0.0839], 0.56 sec/batch.
2022-10-20 20:47:35,922 step [ 192], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0738], 0.55 sec/batch.
2022-10-20 20:47:37,693 step [ 193], lr [0.0003000], embedding loss [ 0.8264], quantization loss [ 0.0794], 0.55 sec/batch.
2022-10-20 20:47:39,479 step [ 194], lr [0.0003000], embedding loss [ 0.8206], quantization loss [ 0.0710], 0.56 sec/batch.
2022-10-20 20:47:41,291 step [ 195], lr [0.0003000], embedding loss [ 0.8213], quantization loss [ 0.0840], 0.55 sec/batch.
2022-10-20 20:47:43,101 step [ 196], lr [0.0003000], embedding loss [ 0.8202], quantization loss [ 0.0778], 0.56 sec/batch.
2022-10-20 20:47:44,913 step [ 197], lr [0.0003000], embedding loss [ 0.8316], quantization loss [ 0.0826], 0.56 sec/batch.
2022-10-20 20:47:46,692 step [ 198], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0706], 0.56 sec/batch.
2022-10-20 20:47:49,512 step [ 199], lr [0.0003000], embedding loss [ 0.8267], quantization loss [ 0.0806], 1.47 sec/batch.
2022-10-20 20:47:52,751 step [ 200], lr [0.0003000], embedding loss [ 0.8212], quantization loss [ 0.0881], 1.99 sec/batch.
2022-10-20 20:47:55,859 step [ 201], lr [0.0003000], embedding loss [ 0.8250], quantization loss [ 0.0861], 1.91 sec/batch.
2022-10-20 20:47:58,464 step [ 202], lr [0.0003000], embedding loss [ 0.8217], quantization loss [ 0.0718], 1.43 sec/batch.
2022-10-20 20:48:01,593 step [ 203], lr [0.0003000], embedding loss [ 0.8196], quantization loss [ 0.0832], 1.97 sec/batch.
2022-10-20 20:48:05,000 step [ 204], lr [0.0003000], embedding loss [ 0.8176], quantization loss [ 0.0738], 2.14 sec/batch.
2022-10-20 20:48:08,177 step [ 205], lr [0.0003000], embedding loss [ 0.8286], quantization loss [ 0.0734], 2.04 sec/batch.
2022-10-20 20:48:11,294 step [ 206], lr [0.0003000], embedding loss [ 0.8284], quantization loss [ 0.0843], 1.94 sec/batch.
2022-10-20 20:48:14,332 step [ 207], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0756], 1.91 sec/batch.
2022-10-20 20:48:17,602 step [ 208], lr [0.0003000], embedding loss [ 0.8243], quantization loss [ 0.0763], 1.98 sec/batch.
2022-10-20 20:48:20,587 step [ 209], lr [0.0003000], embedding loss [ 0.8271], quantization loss [ 0.0733], 1.82 sec/batch.
2022-10-20 20:48:23,191 step [ 210], lr [0.0003000], embedding loss [ 0.8174], quantization loss [ 0.0825], 1.47 sec/batch.
2022-10-20 20:48:26,195 step [ 211], lr [0.0003000], embedding loss [ 0.8275], quantization loss [ 0.0789], 1.88 sec/batch.
2022-10-20 20:48:29,437 step [ 212], lr [0.0003000], embedding loss [ 0.8183], quantization loss [ 0.0837], 2.04 sec/batch.
2022-10-20 20:48:32,658 step [ 213], lr [0.0003000], embedding loss [ 0.8244], quantization loss [ 0.0798], 2.00 sec/batch.
2022-10-20 20:48:35,923 step [ 214], lr [0.0003000], embedding loss [ 0.8219], quantization loss [ 0.0859], 2.08 sec/batch.
2022-10-20 20:48:38,997 step [ 215], lr [0.0003000], embedding loss [ 0.8243], quantization loss [ 0.0976], 1.89 sec/batch.
2022-10-20 20:48:42,429 step [ 216], lr [0.0003000], embedding loss [ 0.8176], quantization loss [ 0.0877], 2.13 sec/batch.
2022-10-20 20:48:44,845 step [ 217], lr [0.0003000], embedding loss [ 0.8219], quantization loss [ 0.0798], 1.15 sec/batch.
2022-10-20 20:48:47,889 step [ 218], lr [0.0003000], embedding loss [ 0.8172], quantization loss [ 0.0726], 1.94 sec/batch.
2022-10-20 20:48:50,869 step [ 219], lr [0.0003000], embedding loss [ 0.8208], quantization loss [ 0.0738], 1.66 sec/batch.
2022-10-20 20:48:54,133 step [ 220], lr [0.0003000], embedding loss [ 0.8225], quantization loss [ 0.0749], 2.05 sec/batch.
2022-10-20 20:48:56,660 step [ 221], lr [0.0003000], embedding loss [ 0.8279], quantization loss [ 0.0745], 1.12 sec/batch.
2022-10-20 20:48:59,106 step [ 222], lr [0.0003000], embedding loss [ 0.8247], quantization loss [ 0.0712], 1.15 sec/batch.
2022-10-20 20:49:01,971 step [ 223], lr [0.0003000], embedding loss [ 0.8270], quantization loss [ 0.0928], 1.71 sec/batch.
2022-10-20 20:49:04,322 step [ 224], lr [0.0003000], embedding loss [ 0.8291], quantization loss [ 0.0746], 1.14 sec/batch.
2022-10-20 20:49:06,205 step [ 225], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0833], 0.57 sec/batch.
2022-10-20 20:49:08,416 step [ 226], lr [0.0003000], embedding loss [ 0.8217], quantization loss [ 0.0857], 1.08 sec/batch.
2022-10-20 20:49:10,778 step [ 227], lr [0.0003000], embedding loss [ 0.8306], quantization loss [ 0.0753], 1.13 sec/batch.
2022-10-20 20:49:13,197 step [ 228], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0695], 1.15 sec/batch.
2022-10-20 20:49:14,992 step [ 229], lr [0.0003000], embedding loss [ 0.8207], quantization loss [ 0.0845], 0.55 sec/batch.
2022-10-20 20:49:16,788 step [ 230], lr [0.0003000], embedding loss [ 0.8201], quantization loss [ 0.0850], 0.58 sec/batch.
2022-10-20 20:49:18,625 step [ 231], lr [0.0003000], embedding loss [ 0.8189], quantization loss [ 0.0754], 0.57 sec/batch.
2022-10-20 20:49:20,483 step [ 232], lr [0.0003000], embedding loss [ 0.8191], quantization loss [ 0.0695], 0.56 sec/batch.
2022-10-20 20:49:22,282 step [ 233], lr [0.0003000], embedding loss [ 0.8169], quantization loss [ 0.0739], 0.57 sec/batch.
2022-10-20 20:49:24,186 step [ 234], lr [0.0003000], embedding loss [ 0.8189], quantization loss [ 0.0670], 0.58 sec/batch.
2022-10-20 20:49:26,034 step [ 235], lr [0.0003000], embedding loss [ 0.8327], quantization loss [ 0.0768], 0.55 sec/batch.
2022-10-20 20:49:27,799 step [ 236], lr [0.0003000], embedding loss [ 0.8347], quantization loss [ 0.0688], 0.57 sec/batch.
2022-10-20 20:49:29,500 step [ 237], lr [0.0003000], embedding loss [ 0.8157], quantization loss [ 0.0651], 0.56 sec/batch.
2022-10-20 20:49:31,272 step [ 238], lr [0.0003000], embedding loss [ 0.8178], quantization loss [ 0.0796], 0.57 sec/batch.
2022-10-20 20:49:33,116 step [ 239], lr [0.0003000], embedding loss [ 0.8222], quantization loss [ 0.0621], 0.56 sec/batch.
2022-10-20 20:49:34,878 step [ 240], lr [0.0003000], embedding loss [ 0.8159], quantization loss [ 0.0746], 0.57 sec/batch.
2022-10-20 20:49:36,665 step [ 241], lr [0.0003000], embedding loss [ 0.8130], quantization loss [ 0.0758], 0.57 sec/batch.
2022-10-20 20:49:38,400 step [ 242], lr [0.0003000], embedding loss [ 0.8156], quantization loss [ 0.0759], 0.57 sec/batch.
2022-10-20 20:49:40,124 step [ 243], lr [0.0003000], embedding loss [ 0.8228], quantization loss [ 0.0673], 0.54 sec/batch.
2022-10-20 20:49:41,843 step [ 244], lr [0.0003000], embedding loss [ 0.8209], quantization loss [ 0.0622], 0.56 sec/batch.
2022-10-20 20:49:43,667 step [ 245], lr [0.0003000], embedding loss [ 0.8128], quantization loss [ 0.0705], 0.56 sec/batch.
2022-10-20 20:49:45,361 step [ 246], lr [0.0003000], embedding loss [ 0.8208], quantization loss [ 0.0684], 0.56 sec/batch.
2022-10-20 20:49:47,101 step [ 247], lr [0.0003000], embedding loss [ 0.8252], quantization loss [ 0.0814], 0.57 sec/batch.
2022-10-20 20:49:48,982 step [ 248], lr [0.0003000], embedding loss [ 0.8223], quantization loss [ 0.0769], 0.61 sec/batch.
2022-10-20 20:49:50,747 step [ 249], lr [0.0003000], embedding loss [ 0.8172], quantization loss [ 0.0778], 0.56 sec/batch.
2022-10-20 20:49:52,516 step [ 250], lr [0.0003000], embedding loss [ 0.8223], quantization loss [ 0.0702], 0.57 sec/batch.
2022-10-20 20:49:54,857 step [ 251], lr [0.0003000], embedding loss [ 0.8187], quantization loss [ 0.0734], 1.12 sec/batch.
2022-10-20 20:49:57,218 step [ 252], lr [0.0003000], embedding loss [ 0.8190], quantization loss [ 0.0667], 1.14 sec/batch.
2022-10-20 20:49:59,443 step [ 253], lr [0.0003000], embedding loss [ 0.8255], quantization loss [ 0.0787], 1.12 sec/batch.
2022-10-20 20:50:01,808 step [ 254], lr [0.0003000], embedding loss [ 0.8264], quantization loss [ 0.0831], 1.04 sec/batch.
2022-10-20 20:50:03,549 step [ 255], lr [0.0003000], embedding loss [ 0.8175], quantization loss [ 0.0789], 0.57 sec/batch.
2022-10-20 20:50:05,908 step [ 256], lr [0.0003000], embedding loss [ 0.8230], quantization loss [ 0.0898], 1.15 sec/batch.
2022-10-20 20:50:08,777 step [ 257], lr [0.0003000], embedding loss [ 0.8235], quantization loss [ 0.0737], 1.67 sec/batch.
2022-10-20 20:50:11,232 step [ 258], lr [0.0003000], embedding loss [ 0.8155], quantization loss [ 0.0876], 1.19 sec/batch.
2022-10-20 20:50:14,074 step [ 259], lr [0.0003000], embedding loss [ 0.8330], quantization loss [ 0.0713], 1.67 sec/batch.
2022-10-20 20:50:16,851 step [ 260], lr [0.0003000], embedding loss [ 0.8224], quantization loss [ 0.0708], 1.64 sec/batch.
2022-10-20 20:50:19,522 step [ 261], lr [0.0003000], embedding loss [ 0.8267], quantization loss [ 0.0666], 1.56 sec/batch.
2022-10-20 20:50:22,174 step [ 262], lr [0.0003000], embedding loss [ 0.8191], quantization loss [ 0.0817], 1.54 sec/batch.
2022-10-20 20:50:24,957 step [ 263], lr [0.0003000], embedding loss [ 0.8260], quantization loss [ 0.0782], 1.58 sec/batch.
2022-10-20 20:50:27,824 step [ 264], lr [0.0003000], embedding loss [ 0.8231], quantization loss [ 0.0711], 1.66 sec/batch.
2022-10-20 20:50:29,646 step [ 265], lr [0.0003000], embedding loss [ 0.8244], quantization loss [ 0.0663], 0.58 sec/batch.
2022-10-20 20:50:31,520 step [ 266], lr [0.0003000], embedding loss [ 0.8273], quantization loss [ 0.0682], 0.56 sec/batch.
2022-10-20 20:50:33,334 step [ 267], lr [0.0003000], embedding loss [ 0.8272], quantization loss [ 0.0735], 0.57 sec/batch.
2022-10-20 20:50:36,529 step [ 268], lr [0.0003000], embedding loss [ 0.8124], quantization loss [ 0.0732], 2.04 sec/batch.
2022-10-20 20:50:39,462 step [ 269], lr [0.0003000], embedding loss [ 0.8294], quantization loss [ 0.0788], 1.66 sec/batch.
2022-10-20 20:50:41,246 step [ 270], lr [0.0003000], embedding loss [ 0.8229], quantization loss [ 0.0713], 0.57 sec/batch.
2022-10-20 20:50:44,556 step [ 271], lr [0.0003000], embedding loss [ 0.8205], quantization loss [ 0.0722], 2.06 sec/batch.
2022-10-20 20:50:46,958 step [ 272], lr [0.0003000], embedding loss [ 0.8258], quantization loss [ 0.0715], 1.09 sec/batch.
2022-10-20 20:50:49,667 step [ 273], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0616], 1.57 sec/batch.
2022-10-20 20:50:52,108 step [ 274], lr [0.0003000], embedding loss [ 0.8188], quantization loss [ 0.0673], 1.13 sec/batch.
2022-10-20 20:50:54,552 step [ 275], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0661], 1.19 sec/batch.
2022-10-20 20:50:56,311 step [ 276], lr [0.0003000], embedding loss [ 0.8145], quantization loss [ 0.0714], 0.56 sec/batch.
2022-10-20 20:50:58,099 step [ 277], lr [0.0003000], embedding loss [ 0.8310], quantization loss [ 0.0825], 0.59 sec/batch.
2022-10-20 20:51:00,545 step [ 278], lr [0.0003000], embedding loss [ 0.8233], quantization loss [ 0.0659], 1.15 sec/batch.
2022-10-20 20:51:02,259 step [ 279], lr [0.0003000], embedding loss [ 0.8223], quantization loss [ 0.0785], 0.56 sec/batch.
2022-10-20 20:51:04,481 step [ 280], lr [0.0003000], embedding loss [ 0.8253], quantization loss [ 0.0751], 1.06 sec/batch.
2022-10-20 20:51:06,176 step [ 281], lr [0.0003000], embedding loss [ 0.8306], quantization loss [ 0.0656], 0.57 sec/batch.
2022-10-20 20:51:08,456 step [ 282], lr [0.0003000], embedding loss [ 0.8246], quantization loss [ 0.0743], 1.07 sec/batch.
2022-10-20 20:51:10,809 step [ 283], lr [0.0003000], embedding loss [ 0.8176], quantization loss [ 0.0775], 1.03 sec/batch.
2022-10-20 20:51:13,215 step [ 284], lr [0.0003000], embedding loss [ 0.8139], quantization loss [ 0.0884], 1.17 sec/batch.
2022-10-20 20:51:15,590 step [ 285], lr [0.0003000], embedding loss [ 0.8214], quantization loss [ 0.0774], 1.17 sec/batch.
2022-10-20 20:51:18,246 step [ 286], lr [0.0003000], embedding loss [ 0.8200], quantization loss [ 0.0800], 1.43 sec/batch.
2022-10-20 20:51:20,191 step [ 287], lr [0.0003000], embedding loss [ 0.8342], quantization loss [ 0.0796], 0.57 sec/batch.
2022-10-20 20:51:22,780 step [ 288], lr [0.0003000], embedding loss [ 0.8176], quantization loss [ 0.0700], 1.32 sec/batch.
2022-10-20 20:51:24,705 step [ 289], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0746], 0.70 sec/batch.
2022-10-20 20:51:27,159 step [ 290], lr [0.0003000], embedding loss [ 0.8281], quantization loss [ 0.0684], 1.20 sec/batch.
2022-10-20 20:51:29,605 step [ 291], lr [0.0003000], embedding loss [ 0.8166], quantization loss [ 0.0650], 1.22 sec/batch.
2022-10-20 20:51:32,009 step [ 292], lr [0.0003000], embedding loss [ 0.8331], quantization loss [ 0.0693], 1.14 sec/batch.
2022-10-20 20:51:33,887 step [ 293], lr [0.0003000], embedding loss [ 0.8179], quantization loss [ 0.0706], 0.57 sec/batch.
2022-10-20 20:51:33,887 update codes and centers iter(1/1).
2022-10-20 20:51:39,945 number of update_code wrong: 0.
2022-10-20 20:51:42,679 non zero codewords: 768.
2022-10-20 20:51:42,679 finish center update, duration: 8.79 sec.
2022-10-20 20:51:44,393 step [ 294], lr [0.0003000], embedding loss [ 0.8236], quantization loss [ 0.0235], 0.57 sec/batch.
2022-10-20 20:51:46,109 step [ 295], lr [0.0003000], embedding loss [ 0.8278], quantization loss [ 0.0192], 0.55 sec/batch.
2022-10-20 20:51:47,767 step [ 296], lr [0.0003000], embedding loss [ 0.8222], quantization loss [ 0.0201], 0.55 sec/batch.
2022-10-20 20:51:50,095 step [ 297], lr [0.0003000], embedding loss [ 0.8177], quantization loss [ 0.0229], 1.13 sec/batch.
2022-10-20 20:51:52,412 step [ 298], lr [0.0003000], embedding loss [ 0.8193], quantization loss [ 0.0220], 1.12 sec/batch.
2022-10-20 20:51:54,796 step [ 299], lr [0.0003000], embedding loss [ 0.8280], quantization loss [ 0.0209], 1.16 sec/batch.
2022-10-20 20:51:57,093 step [ 300], lr [0.0003000], embedding loss [ 0.8272], quantization loss [ 0.0204], 1.15 sec/batch.
2022-10-20 20:51:59,828 step [ 301], lr [0.0001500], embedding loss [ 0.8362], quantization loss [ 0.0193], 1.59 sec/batch.
2022-10-20 20:52:02,760 step [ 302], lr [0.0001500], embedding loss [ 0.8189], quantization loss [ 0.0214], 1.67 sec/batch.
2022-10-20 20:52:05,640 step [ 303], lr [0.0001500], embedding loss [ 0.8132], quantization loss [ 0.0190], 1.61 sec/batch.
2022-10-20 20:52:07,482 step [ 304], lr [0.0001500], embedding loss [ 0.8286], quantization loss [ 0.0204], 0.57 sec/batch.
2022-10-20 20:52:09,287 step [ 305], lr [0.0001500], embedding loss [ 0.8259], quantization loss [ 0.0203], 0.55 sec/batch.
2022-10-20 20:52:11,104 step [ 306], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0220], 0.57 sec/batch.
2022-10-20 20:52:13,564 step [ 307], lr [0.0001500], embedding loss [ 0.8243], quantization loss [ 0.0176], 1.11 sec/batch.
2022-10-20 20:52:15,894 step [ 308], lr [0.0001500], embedding loss [ 0.8177], quantization loss [ 0.0193], 1.14 sec/batch.
2022-10-20 20:52:18,254 step [ 309], lr [0.0001500], embedding loss [ 0.8226], quantization loss [ 0.0192], 1.16 sec/batch.
2022-10-20 20:52:20,611 step [ 310], lr [0.0001500], embedding loss [ 0.8223], quantization loss [ 0.0192], 1.11 sec/batch.
2022-10-20 20:52:23,307 step [ 311], lr [0.0001500], embedding loss [ 0.8247], quantization loss [ 0.0212], 1.50 sec/batch.
2022-10-20 20:52:26,217 step [ 312], lr [0.0001500], embedding loss [ 0.8257], quantization loss [ 0.0185], 1.67 sec/batch.
2022-10-20 20:52:28,709 step [ 313], lr [0.0001500], embedding loss [ 0.8177], quantization loss [ 0.0192], 1.20 sec/batch.
2022-10-20 20:52:31,595 step [ 314], lr [0.0001500], embedding loss [ 0.8206], quantization loss [ 0.0199], 1.67 sec/batch.
2022-10-20 20:52:35,020 step [ 315], lr [0.0001500], embedding loss [ 0.8208], quantization loss [ 0.0190], 2.23 sec/batch.
2022-10-20 20:52:38,025 step [ 316], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0193], 1.74 sec/batch.
2022-10-20 20:52:40,951 step [ 317], lr [0.0001500], embedding loss [ 0.8192], quantization loss [ 0.0205], 1.70 sec/batch.
2022-10-20 20:52:44,448 step [ 318], lr [0.0001500], embedding loss [ 0.8254], quantization loss [ 0.0176], 2.24 sec/batch.
2022-10-20 20:52:47,717 step [ 319], lr [0.0001500], embedding loss [ 0.8317], quantization loss [ 0.0188], 2.12 sec/batch.
2022-10-20 20:52:51,074 step [ 320], lr [0.0001500], embedding loss [ 0.8321], quantization loss [ 0.0179], 2.17 sec/batch.
2022-10-20 20:52:54,424 step [ 321], lr [0.0001500], embedding loss [ 0.8210], quantization loss [ 0.0195], 2.15 sec/batch.
2022-10-20 20:52:57,947 step [ 322], lr [0.0001500], embedding loss [ 0.8247], quantization loss [ 0.0185], 2.28 sec/batch.
2022-10-20 20:53:01,327 step [ 323], lr [0.0001500], embedding loss [ 0.8237], quantization loss [ 0.0194], 2.18 sec/batch.
2022-10-20 20:53:04,612 step [ 324], lr [0.0001500], embedding loss [ 0.8302], quantization loss [ 0.0201], 2.08 sec/batch.
2022-10-20 20:53:07,927 step [ 325], lr [0.0001500], embedding loss [ 0.8273], quantization loss [ 0.0185], 2.11 sec/batch.
2022-10-20 20:53:11,387 step [ 326], lr [0.0001500], embedding loss [ 0.8233], quantization loss [ 0.0192], 2.25 sec/batch.
2022-10-20 20:53:14,815 step [ 327], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0190], 2.21 sec/batch.
2022-10-20 20:53:18,306 step [ 328], lr [0.0001500], embedding loss [ 0.8310], quantization loss [ 0.0190], 2.21 sec/batch.
2022-10-20 20:53:21,737 step [ 329], lr [0.0001500], embedding loss [ 0.8204], quantization loss [ 0.0171], 2.15 sec/batch.
2022-10-20 20:53:25,021 step [ 330], lr [0.0001500], embedding loss [ 0.8228], quantization loss [ 0.0174], 2.13 sec/batch.
2022-10-20 20:53:28,278 step [ 331], lr [0.0001500], embedding loss [ 0.8148], quantization loss [ 0.0178], 2.08 sec/batch.
2022-10-20 20:53:31,715 step [ 332], lr [0.0001500], embedding loss [ 0.8165], quantization loss [ 0.0193], 2.15 sec/batch.
2022-10-20 20:53:35,077 step [ 333], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0191], 2.14 sec/batch.
2022-10-20 20:53:38,422 step [ 334], lr [0.0001500], embedding loss [ 0.8224], quantization loss [ 0.0176], 2.19 sec/batch.
2022-10-20 20:53:41,745 step [ 335], lr [0.0001500], embedding loss [ 0.8284], quantization loss [ 0.0183], 2.12 sec/batch.
2022-10-20 20:53:45,250 step [ 336], lr [0.0001500], embedding loss [ 0.8225], quantization loss [ 0.0179], 2.22 sec/batch.
2022-10-20 20:53:48,521 step [ 337], lr [0.0001500], embedding loss [ 0.8286], quantization loss [ 0.0176], 2.09 sec/batch.
2022-10-20 20:53:50,918 step [ 338], lr [0.0001500], embedding loss [ 0.8183], quantization loss [ 0.0198], 1.20 sec/batch.
2022-10-20 20:53:54,328 step [ 339], lr [0.0001500], embedding loss [ 0.8258], quantization loss [ 0.0190], 2.11 sec/batch.
2022-10-20 20:53:57,653 step [ 340], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0176], 2.05 sec/batch.
2022-10-20 20:54:00,626 step [ 341], lr [0.0001500], embedding loss [ 0.8140], quantization loss [ 0.0187], 1.75 sec/batch.
2022-10-20 20:54:04,121 step [ 342], lr [0.0001500], embedding loss [ 0.8199], quantization loss [ 0.0188], 2.18 sec/batch.
2022-10-20 20:54:07,482 step [ 343], lr [0.0001500], embedding loss [ 0.8152], quantization loss [ 0.0191], 2.15 sec/batch.
2022-10-20 20:54:10,726 step [ 344], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0178], 2.05 sec/batch.
2022-10-20 20:54:14,209 step [ 345], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0196], 2.18 sec/batch.
2022-10-20 20:54:17,638 step [ 346], lr [0.0001500], embedding loss [ 0.8253], quantization loss [ 0.0186], 2.21 sec/batch.
2022-10-20 20:54:20,949 step [ 347], lr [0.0001500], embedding loss [ 0.8133], quantization loss [ 0.0201], 2.15 sec/batch.
2022-10-20 20:54:24,284 step [ 348], lr [0.0001500], embedding loss [ 0.8286], quantization loss [ 0.0175], 2.07 sec/batch.
2022-10-20 20:54:27,384 step [ 349], lr [0.0001500], embedding loss [ 0.8176], quantization loss [ 0.0205], 1.83 sec/batch.
2022-10-20 20:54:30,752 step [ 350], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0198], 2.12 sec/batch.
2022-10-20 20:54:33,994 step [ 351], lr [0.0001500], embedding loss [ 0.8274], quantization loss [ 0.0188], 2.12 sec/batch.
2022-10-20 20:54:37,233 step [ 352], lr [0.0001500], embedding loss [ 0.8253], quantization loss [ 0.0199], 2.09 sec/batch.
2022-10-20 20:54:40,495 step [ 353], lr [0.0001500], embedding loss [ 0.8139], quantization loss [ 0.0175], 2.07 sec/batch.
2022-10-20 20:54:43,993 step [ 354], lr [0.0001500], embedding loss [ 0.8184], quantization loss [ 0.0187], 2.24 sec/batch.
2022-10-20 20:54:47,478 step [ 355], lr [0.0001500], embedding loss [ 0.8246], quantization loss [ 0.0185], 2.20 sec/batch.
2022-10-20 20:54:50,911 step [ 356], lr [0.0001500], embedding loss [ 0.8146], quantization loss [ 0.0190], 2.24 sec/batch.
2022-10-20 20:54:54,237 step [ 357], lr [0.0001500], embedding loss [ 0.8158], quantization loss [ 0.0183], 2.11 sec/batch.
2022-10-20 20:54:57,768 step [ 358], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0175], 2.24 sec/batch.
2022-10-20 20:55:01,091 step [ 359], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0184], 2.16 sec/batch.
2022-10-20 20:55:04,316 step [ 360], lr [0.0001500], embedding loss [ 0.8200], quantization loss [ 0.0170], 2.12 sec/batch.
2022-10-20 20:55:07,638 step [ 361], lr [0.0001500], embedding loss [ 0.8170], quantization loss [ 0.0183], 2.10 sec/batch.
2022-10-20 20:55:10,992 step [ 362], lr [0.0001500], embedding loss [ 0.8166], quantization loss [ 0.0183], 2.14 sec/batch.
2022-10-20 20:55:14,228 step [ 363], lr [0.0001500], embedding loss [ 0.8175], quantization loss [ 0.0171], 2.08 sec/batch.
2022-10-20 20:55:17,708 step [ 364], lr [0.0001500], embedding loss [ 0.8188], quantization loss [ 0.0191], 2.27 sec/batch.
2022-10-20 20:55:21,217 step [ 365], lr [0.0001500], embedding loss [ 0.8249], quantization loss [ 0.0175], 2.25 sec/batch.
2022-10-20 20:55:24,656 step [ 366], lr [0.0001500], embedding loss [ 0.8240], quantization loss [ 0.0170], 2.19 sec/batch.
2022-10-20 20:55:26,425 step [ 367], lr [0.0001500], embedding loss [ 0.8230], quantization loss [ 0.0187], 0.58 sec/batch.
2022-10-20 20:55:29,820 step [ 368], lr [0.0001500], embedding loss [ 0.8316], quantization loss [ 0.0176], 2.16 sec/batch.
2022-10-20 20:55:33,164 step [ 369], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0178], 2.07 sec/batch.
2022-10-20 20:55:36,718 step [ 370], lr [0.0001500], embedding loss [ 0.8211], quantization loss [ 0.0173], 2.25 sec/batch.
2022-10-20 20:55:40,037 step [ 371], lr [0.0001500], embedding loss [ 0.8265], quantization loss [ 0.0170], 2.10 sec/batch.
2022-10-20 20:55:42,462 step [ 372], lr [0.0001500], embedding loss [ 0.8237], quantization loss [ 0.0172], 1.20 sec/batch.
2022-10-20 20:55:45,896 step [ 373], lr [0.0001500], embedding loss [ 0.8220], quantization loss [ 0.0177], 2.19 sec/batch.
2022-10-20 20:55:49,337 step [ 374], lr [0.0001500], embedding loss [ 0.8267], quantization loss [ 0.0165], 2.16 sec/batch.
2022-10-20 20:55:52,856 step [ 375], lr [0.0001500], embedding loss [ 0.8193], quantization loss [ 0.0181], 2.29 sec/batch.
2022-10-20 20:55:56,351 step [ 376], lr [0.0001500], embedding loss [ 0.8123], quantization loss [ 0.0173], 2.22 sec/batch.
2022-10-20 20:55:59,687 step [ 377], lr [0.0001500], embedding loss [ 0.8335], quantization loss [ 0.0180], 2.12 sec/batch.
2022-10-20 20:56:03,155 step [ 378], lr [0.0001500], embedding loss [ 0.8271], quantization loss [ 0.0182], 2.26 sec/batch.
2022-10-20 20:56:06,558 step [ 379], lr [0.0001500], embedding loss [ 0.8282], quantization loss [ 0.0166], 2.16 sec/batch.
2022-10-20 20:56:10,039 step [ 380], lr [0.0001500], embedding loss [ 0.8316], quantization loss [ 0.0166], 2.23 sec/batch.
2022-10-20 20:56:13,488 step [ 381], lr [0.0001500], embedding loss [ 0.8204], quantization loss [ 0.0189], 2.24 sec/batch.
2022-10-20 20:56:17,028 step [ 382], lr [0.0001500], embedding loss [ 0.8152], quantization loss [ 0.0182], 2.22 sec/batch.
2022-10-20 20:56:20,316 step [ 383], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0174], 2.11 sec/batch.
2022-10-20 20:56:23,900 step [ 384], lr [0.0001500], embedding loss [ 0.8285], quantization loss [ 0.0182], 2.24 sec/batch.
2022-10-20 20:56:27,248 step [ 385], lr [0.0001500], embedding loss [ 0.8144], quantization loss [ 0.0177], 2.10 sec/batch.
2022-10-20 20:56:30,617 step [ 386], lr [0.0001500], embedding loss [ 0.8207], quantization loss [ 0.0164], 2.15 sec/batch.
2022-10-20 20:56:33,901 step [ 387], lr [0.0001500], embedding loss [ 0.8183], quantization loss [ 0.0179], 2.05 sec/batch.
2022-10-20 20:56:37,302 step [ 388], lr [0.0001500], embedding loss [ 0.8181], quantization loss [ 0.0169], 2.22 sec/batch.
2022-10-20 20:56:40,740 step [ 389], lr [0.0001500], embedding loss [ 0.8142], quantization loss [ 0.0181], 2.24 sec/batch.
2022-10-20 20:56:44,066 step [ 390], lr [0.0001500], embedding loss [ 0.8267], quantization loss [ 0.0192], 2.10 sec/batch.
2022-10-20 20:56:47,514 step [ 391], lr [0.0001500], embedding loss [ 0.8238], quantization loss [ 0.0166], 2.07 sec/batch.
2022-10-20 20:56:50,963 step [ 392], lr [0.0001500], embedding loss [ 0.8275], quantization loss [ 0.0160], 2.20 sec/batch.
2022-10-20 20:56:54,324 step [ 393], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0160], 2.16 sec/batch.
2022-10-20 20:56:57,784 step [ 394], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0191], 2.23 sec/batch.
2022-10-20 20:57:01,315 step [ 395], lr [0.0001500], embedding loss [ 0.8303], quantization loss [ 0.0171], 2.21 sec/batch.
2022-10-20 20:57:04,664 step [ 396], lr [0.0001500], embedding loss [ 0.8260], quantization loss [ 0.0186], 2.17 sec/batch.
2022-10-20 20:57:07,992 step [ 397], lr [0.0001500], embedding loss [ 0.8185], quantization loss [ 0.0184], 2.14 sec/batch.
2022-10-20 20:57:11,312 step [ 398], lr [0.0001500], embedding loss [ 0.8135], quantization loss [ 0.0178], 2.10 sec/batch.
2022-10-20 20:57:14,607 step [ 399], lr [0.0001500], embedding loss [ 0.8181], quantization loss [ 0.0161], 2.07 sec/batch.
2022-10-20 20:57:18,044 step [ 400], lr [0.0001500], embedding loss [ 0.8218], quantization loss [ 0.0172], 2.17 sec/batch.
2022-10-20 20:57:21,411 step [ 401], lr [0.0001500], embedding loss [ 0.8250], quantization loss [ 0.0186], 2.15 sec/batch.
2022-10-20 20:57:24,718 step [ 402], lr [0.0001500], embedding loss [ 0.8192], quantization loss [ 0.0191], 2.09 sec/batch.
2022-10-20 20:57:27,990 step [ 403], lr [0.0001500], embedding loss [ 0.8150], quantization loss [ 0.0179], 2.06 sec/batch.
2022-10-20 20:57:31,321 step [ 404], lr [0.0001500], embedding loss [ 0.8243], quantization loss [ 0.0167], 2.14 sec/batch.
2022-10-20 20:57:34,633 step [ 405], lr [0.0001500], embedding loss [ 0.8246], quantization loss [ 0.0165], 2.12 sec/batch.
2022-10-20 20:57:38,011 step [ 406], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0179], 2.17 sec/batch.
2022-10-20 20:57:41,382 step [ 407], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0193], 2.18 sec/batch.
2022-10-20 20:57:44,742 step [ 408], lr [0.0001500], embedding loss [ 0.8191], quantization loss [ 0.0179], 2.13 sec/batch.
2022-10-20 20:57:48,044 step [ 409], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0177], 2.05 sec/batch.
2022-10-20 20:57:51,398 step [ 410], lr [0.0001500], embedding loss [ 0.8162], quantization loss [ 0.0171], 2.14 sec/batch.
2022-10-20 20:57:54,677 step [ 411], lr [0.0001500], embedding loss [ 0.8189], quantization loss [ 0.0173], 2.10 sec/batch.
2022-10-20 20:57:58,025 step [ 412], lr [0.0001500], embedding loss [ 0.8175], quantization loss [ 0.0172], 2.17 sec/batch.
2022-10-20 20:58:01,213 step [ 413], lr [0.0001500], embedding loss [ 0.8232], quantization loss [ 0.0196], 2.04 sec/batch.
2022-10-20 20:58:04,402 step [ 414], lr [0.0001500], embedding loss [ 0.8229], quantization loss [ 0.0174], 2.04 sec/batch.
2022-10-20 20:58:07,644 step [ 415], lr [0.0001500], embedding loss [ 0.8220], quantization loss [ 0.0156], 2.05 sec/batch.
2022-10-20 20:58:10,912 step [ 416], lr [0.0001500], embedding loss [ 0.8232], quantization loss [ 0.0171], 2.07 sec/batch.
2022-10-20 20:58:14,230 step [ 417], lr [0.0001500], embedding loss [ 0.8243], quantization loss [ 0.0179], 2.11 sec/batch.
2022-10-20 20:58:17,473 step [ 418], lr [0.0001500], embedding loss [ 0.8236], quantization loss [ 0.0168], 2.07 sec/batch.
2022-10-20 20:58:20,661 step [ 419], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0166], 2.04 sec/batch.
2022-10-20 20:58:23,937 step [ 420], lr [0.0001500], embedding loss [ 0.8149], quantization loss [ 0.0189], 2.07 sec/batch.
2022-10-20 20:58:27,163 step [ 421], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0186], 2.02 sec/batch.
2022-10-20 20:58:30,530 step [ 422], lr [0.0001500], embedding loss [ 0.8154], quantization loss [ 0.0188], 2.17 sec/batch.
2022-10-20 20:58:33,886 step [ 423], lr [0.0001500], embedding loss [ 0.8258], quantization loss [ 0.0190], 2.17 sec/batch.
2022-10-20 20:58:37,237 step [ 424], lr [0.0001500], embedding loss [ 0.8138], quantization loss [ 0.0172], 2.18 sec/batch.
2022-10-20 20:58:40,516 step [ 425], lr [0.0001500], embedding loss [ 0.8163], quantization loss [ 0.0184], 2.11 sec/batch.
2022-10-20 20:58:43,820 step [ 426], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0167], 2.11 sec/batch.
2022-10-20 20:58:47,137 step [ 427], lr [0.0001500], embedding loss [ 0.8243], quantization loss [ 0.0188], 2.15 sec/batch.
2022-10-20 20:58:50,362 step [ 428], lr [0.0001500], embedding loss [ 0.8265], quantization loss [ 0.0182], 2.07 sec/batch.
2022-10-20 20:58:53,678 step [ 429], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0181], 2.15 sec/batch.
2022-10-20 20:58:57,044 step [ 430], lr [0.0001500], embedding loss [ 0.8230], quantization loss [ 0.0158], 2.16 sec/batch.
2022-10-20 20:59:00,380 step [ 431], lr [0.0001500], embedding loss [ 0.8145], quantization loss [ 0.0180], 2.16 sec/batch.
2022-10-20 20:59:03,671 step [ 432], lr [0.0001500], embedding loss [ 0.8298], quantization loss [ 0.0183], 2.10 sec/batch.
2022-10-20 20:59:07,024 step [ 433], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0180], 2.16 sec/batch.
2022-10-20 20:59:10,396 step [ 434], lr [0.0001500], embedding loss [ 0.8189], quantization loss [ 0.0179], 2.16 sec/batch.
2022-10-20 20:59:13,661 step [ 435], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0180], 2.11 sec/batch.
2022-10-20 20:59:16,939 step [ 436], lr [0.0001500], embedding loss [ 0.8144], quantization loss [ 0.0169], 2.11 sec/batch.
2022-10-20 20:59:20,338 step [ 437], lr [0.0001500], embedding loss [ 0.8134], quantization loss [ 0.0167], 2.17 sec/batch.
2022-10-20 20:59:23,653 step [ 438], lr [0.0001500], embedding loss [ 0.8211], quantization loss [ 0.0171], 2.11 sec/batch.
2022-10-20 20:59:26,953 step [ 439], lr [0.0001500], embedding loss [ 0.8309], quantization loss [ 0.0166], 2.12 sec/batch.
2022-10-20 20:59:26,954 update codes and centers iter(1/1).
2022-10-20 20:59:32,658 number of update_code wrong: 0.
2022-10-20 20:59:35,071 non zero codewords: 768.
2022-10-20 20:59:35,071 finish center update, duration: 8.12 sec.
2022-10-20 20:59:38,364 step [ 440], lr [0.0001500], embedding loss [ 0.8236], quantization loss [ 0.0112], 2.06 sec/batch.
2022-10-20 20:59:41,765 step [ 441], lr [0.0001500], embedding loss [ 0.8285], quantization loss [ 0.0134], 2.17 sec/batch.
2022-10-20 20:59:45,181 step [ 442], lr [0.0001500], embedding loss [ 0.8157], quantization loss [ 0.0128], 2.17 sec/batch.
2022-10-20 20:59:48,620 step [ 443], lr [0.0001500], embedding loss [ 0.8225], quantization loss [ 0.0132], 2.19 sec/batch.
2022-10-20 20:59:51,917 step [ 444], lr [0.0001500], embedding loss [ 0.8245], quantization loss [ 0.0119], 2.08 sec/batch.
2022-10-20 20:59:55,230 step [ 445], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0134], 2.05 sec/batch.
2022-10-20 20:59:58,622 step [ 446], lr [0.0001500], embedding loss [ 0.8317], quantization loss [ 0.0138], 2.15 sec/batch.
2022-10-20 21:00:02,004 step [ 447], lr [0.0001500], embedding loss [ 0.8198], quantization loss [ 0.0134], 2.18 sec/batch.
2022-10-20 21:00:05,377 step [ 448], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0129], 2.16 sec/batch.
2022-10-20 21:00:08,604 step [ 449], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0131], 2.04 sec/batch.
2022-10-20 21:00:11,894 step [ 450], lr [0.0001500], embedding loss [ 0.8218], quantization loss [ 0.0117], 2.07 sec/batch.
2022-10-20 21:00:15,182 step [ 451], lr [0.0001500], embedding loss [ 0.8115], quantization loss [ 0.0121], 2.05 sec/batch.
2022-10-20 21:00:18,539 step [ 452], lr [0.0001500], embedding loss [ 0.8258], quantization loss [ 0.0121], 2.15 sec/batch.
2022-10-20 21:00:21,926 step [ 453], lr [0.0001500], embedding loss [ 0.8210], quantization loss [ 0.0126], 2.20 sec/batch.
2022-10-20 21:00:25,396 step [ 454], lr [0.0001500], embedding loss [ 0.8312], quantization loss [ 0.0128], 2.20 sec/batch.
2022-10-20 21:00:28,704 step [ 455], lr [0.0001500], embedding loss [ 0.8150], quantization loss [ 0.0141], 2.10 sec/batch.
2022-10-20 21:00:30,502 step [ 456], lr [0.0001500], embedding loss [ 0.8161], quantization loss [ 0.0126], 0.61 sec/batch.
2022-10-20 21:00:34,015 step [ 457], lr [0.0001500], embedding loss [ 0.8137], quantization loss [ 0.0127], 2.19 sec/batch.
2022-10-20 21:00:37,489 step [ 458], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0124], 2.20 sec/batch.
2022-10-20 21:00:40,892 step [ 459], lr [0.0001500], embedding loss [ 0.8237], quantization loss [ 0.0110], 2.17 sec/batch.
2022-10-20 21:00:44,228 step [ 460], lr [0.0001500], embedding loss [ 0.8206], quantization loss [ 0.0135], 2.05 sec/batch.
2022-10-20 21:00:47,495 step [ 461], lr [0.0001500], embedding loss [ 0.8259], quantization loss [ 0.0119], 2.03 sec/batch.
2022-10-20 21:00:50,887 step [ 462], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0130], 2.17 sec/batch.
2022-10-20 21:00:54,305 step [ 463], lr [0.0001500], embedding loss [ 0.8102], quantization loss [ 0.0117], 2.15 sec/batch.
2022-10-20 21:00:57,773 step [ 464], lr [0.0001500], embedding loss [ 0.8139], quantization loss [ 0.0113], 2.16 sec/batch.
2022-10-20 21:01:01,144 step [ 465], lr [0.0001500], embedding loss [ 0.8181], quantization loss [ 0.0117], 2.16 sec/batch.
2022-10-20 21:01:04,474 step [ 466], lr [0.0001500], embedding loss [ 0.8225], quantization loss [ 0.0129], 2.13 sec/batch.
2022-10-20 21:01:07,790 step [ 467], lr [0.0001500], embedding loss [ 0.8181], quantization loss [ 0.0121], 2.05 sec/batch.
2022-10-20 21:01:11,142 step [ 468], lr [0.0001500], embedding loss [ 0.8251], quantization loss [ 0.0114], 2.12 sec/batch.
2022-10-20 21:01:14,542 step [ 469], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0126], 2.11 sec/batch.
2022-10-20 21:01:17,945 step [ 470], lr [0.0001500], embedding loss [ 0.8126], quantization loss [ 0.0117], 2.17 sec/batch.
2022-10-20 21:01:21,324 step [ 471], lr [0.0001500], embedding loss [ 0.8220], quantization loss [ 0.0134], 2.17 sec/batch.
2022-10-20 21:01:24,741 step [ 472], lr [0.0001500], embedding loss [ 0.8343], quantization loss [ 0.0115], 2.16 sec/batch.
2022-10-20 21:01:28,039 step [ 473], lr [0.0001500], embedding loss [ 0.8283], quantization loss [ 0.0115], 2.11 sec/batch.
2022-10-20 21:01:31,329 step [ 474], lr [0.0001500], embedding loss [ 0.8322], quantization loss [ 0.0121], 2.06 sec/batch.
2022-10-20 21:01:34,651 step [ 475], lr [0.0001500], embedding loss [ 0.8202], quantization loss [ 0.0117], 2.08 sec/batch.
2022-10-20 21:01:38,009 step [ 476], lr [0.0001500], embedding loss [ 0.8241], quantization loss [ 0.0118], 2.16 sec/batch.
2022-10-20 21:01:41,390 step [ 477], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0129], 2.15 sec/batch.
2022-10-20 21:01:44,702 step [ 478], lr [0.0001500], embedding loss [ 0.8279], quantization loss [ 0.0117], 2.09 sec/batch.
2022-10-20 21:01:47,859 step [ 479], lr [0.0001500], embedding loss [ 0.8172], quantization loss [ 0.0136], 1.93 sec/batch.
2022-10-20 21:01:51,019 step [ 480], lr [0.0001500], embedding loss [ 0.8325], quantization loss [ 0.0127], 1.96 sec/batch.
2022-10-20 21:01:54,191 step [ 481], lr [0.0001500], embedding loss [ 0.8210], quantization loss [ 0.0132], 1.98 sec/batch.
2022-10-20 21:01:57,428 step [ 482], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0106], 2.01 sec/batch.
2022-10-20 21:02:00,537 step [ 483], lr [0.0001500], embedding loss [ 0.8200], quantization loss [ 0.0122], 1.90 sec/batch.
2022-10-20 21:02:03,566 step [ 484], lr [0.0001500], embedding loss [ 0.8298], quantization loss [ 0.0120], 1.87 sec/batch.
2022-10-20 21:02:06,655 step [ 485], lr [0.0001500], embedding loss [ 0.8208], quantization loss [ 0.0116], 1.89 sec/batch.
2022-10-20 21:02:09,877 step [ 486], lr [0.0001500], embedding loss [ 0.8227], quantization loss [ 0.0129], 1.99 sec/batch.
2022-10-20 21:02:13,065 step [ 487], lr [0.0001500], embedding loss [ 0.8156], quantization loss [ 0.0121], 1.96 sec/batch.
2022-10-20 21:02:16,249 step [ 488], lr [0.0001500], embedding loss [ 0.8205], quantization loss [ 0.0126], 1.98 sec/batch.
2022-10-20 21:02:19,417 step [ 489], lr [0.0001500], embedding loss [ 0.8240], quantization loss [ 0.0112], 1.93 sec/batch.
2022-10-20 21:02:22,611 step [ 490], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0121], 1.96 sec/batch.
2022-10-20 21:02:25,806 step [ 491], lr [0.0001500], embedding loss [ 0.8255], quantization loss [ 0.0130], 1.98 sec/batch.
2022-10-20 21:02:28,702 step [ 492], lr [0.0001500], embedding loss [ 0.8149], quantization loss [ 0.0118], 1.61 sec/batch.
2022-10-20 21:02:31,895 step [ 493], lr [0.0001500], embedding loss [ 0.8191], quantization loss [ 0.0114], 1.90 sec/batch.
2022-10-20 21:02:35,103 step [ 494], lr [0.0001500], embedding loss [ 0.8260], quantization loss [ 0.0114], 1.88 sec/batch.
2022-10-20 21:02:38,284 step [ 495], lr [0.0001500], embedding loss [ 0.8177], quantization loss [ 0.0127], 1.89 sec/batch.
2022-10-20 21:02:41,471 step [ 496], lr [0.0001500], embedding loss [ 0.8171], quantization loss [ 0.0120], 1.91 sec/batch.
2022-10-20 21:02:44,612 step [ 497], lr [0.0001500], embedding loss [ 0.8249], quantization loss [ 0.0116], 1.87 sec/batch.
2022-10-20 21:02:47,670 step [ 498], lr [0.0001500], embedding loss [ 0.8181], quantization loss [ 0.0115], 1.85 sec/batch.
2022-10-20 21:02:50,752 step [ 499], lr [0.0001500], embedding loss [ 0.8268], quantization loss [ 0.0123], 1.83 sec/batch.
2022-10-20 21:02:53,876 step [ 500], lr [0.0001500], embedding loss [ 0.8248], quantization loss [ 0.0124], 1.87 sec/batch.
2022-10-20 21:02:57,101 step [ 501], lr [0.0001500], embedding loss [ 0.8265], quantization loss [ 0.0118], 1.90 sec/batch.
2022-10-20 21:03:00,275 step [ 502], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0119], 1.88 sec/batch.
2022-10-20 21:03:03,167 step [ 503], lr [0.0001500], embedding loss [ 0.8255], quantization loss [ 0.0112], 1.68 sec/batch.
2022-10-20 21:03:06,178 step [ 504], lr [0.0001500], embedding loss [ 0.8232], quantization loss [ 0.0122], 1.80 sec/batch.
2022-10-20 21:03:09,295 step [ 505], lr [0.0001500], embedding loss [ 0.8242], quantization loss [ 0.0114], 1.90 sec/batch.
2022-10-20 21:03:12,393 step [ 506], lr [0.0001500], embedding loss [ 0.8190], quantization loss [ 0.0121], 1.88 sec/batch.
2022-10-20 21:03:15,431 step [ 507], lr [0.0001500], embedding loss [ 0.8251], quantization loss [ 0.0111], 1.79 sec/batch.
2022-10-20 21:03:18,543 step [ 508], lr [0.0001500], embedding loss [ 0.8177], quantization loss [ 0.0117], 1.87 sec/batch.
2022-10-20 21:03:21,639 step [ 509], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0117], 1.88 sec/batch.
2022-10-20 21:03:24,761 step [ 510], lr [0.0001500], embedding loss [ 0.8209], quantization loss [ 0.0115], 1.88 sec/batch.
2022-10-20 21:03:27,917 step [ 511], lr [0.0001500], embedding loss [ 0.8314], quantization loss [ 0.0132], 1.89 sec/batch.
2022-10-20 21:03:31,032 step [ 512], lr [0.0001500], embedding loss [ 0.8274], quantization loss [ 0.0117], 1.87 sec/batch.
2022-10-20 21:03:34,090 step [ 513], lr [0.0001500], embedding loss [ 0.8284], quantization loss [ 0.0114], 1.86 sec/batch.
2022-10-20 21:03:37,169 step [ 514], lr [0.0001500], embedding loss [ 0.8243], quantization loss [ 0.0113], 1.88 sec/batch.
2022-10-20 21:03:40,263 step [ 515], lr [0.0001500], embedding loss [ 0.8206], quantization loss [ 0.0116], 1.87 sec/batch.
2022-10-20 21:03:43,356 step [ 516], lr [0.0001500], embedding loss [ 0.8196], quantization loss [ 0.0111], 1.87 sec/batch.
2022-10-20 21:03:46,439 step [ 517], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0113], 1.84 sec/batch.
2022-10-20 21:03:49,562 step [ 518], lr [0.0001500], embedding loss [ 0.8189], quantization loss [ 0.0118], 1.86 sec/batch.
2022-10-20 21:03:52,659 step [ 519], lr [0.0001500], embedding loss [ 0.8250], quantization loss [ 0.0120], 1.87 sec/batch.
2022-10-20 21:03:55,760 step [ 520], lr [0.0001500], embedding loss [ 0.8238], quantization loss [ 0.0117], 1.84 sec/batch.
2022-10-20 21:03:58,822 step [ 521], lr [0.0001500], embedding loss [ 0.8291], quantization loss [ 0.0124], 1.83 sec/batch.
2022-10-20 21:04:01,957 step [ 522], lr [0.0001500], embedding loss [ 0.8293], quantization loss [ 0.0113], 1.92 sec/batch.
2022-10-20 21:04:04,969 step [ 523], lr [0.0001500], embedding loss [ 0.8254], quantization loss [ 0.0118], 1.82 sec/batch.
2022-10-20 21:04:08,084 step [ 524], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0118], 1.88 sec/batch.
2022-10-20 21:04:11,259 step [ 525], lr [0.0001500], embedding loss [ 0.8238], quantization loss [ 0.0119], 1.88 sec/batch.
2022-10-20 21:04:14,337 step [ 526], lr [0.0001500], embedding loss [ 0.8162], quantization loss [ 0.0117], 1.84 sec/batch.
2022-10-20 21:04:17,457 step [ 527], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0116], 1.88 sec/batch.
2022-10-20 21:04:20,548 step [ 528], lr [0.0001500], embedding loss [ 0.8166], quantization loss [ 0.0118], 1.88 sec/batch.
2022-10-20 21:04:23,696 step [ 529], lr [0.0001500], embedding loss [ 0.8146], quantization loss [ 0.0120], 1.94 sec/batch.
2022-10-20 21:04:26,830 step [ 530], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0121], 1.93 sec/batch.
2022-10-20 21:04:29,949 step [ 531], lr [0.0001500], embedding loss [ 0.8151], quantization loss [ 0.0109], 1.91 sec/batch.
2022-10-20 21:04:33,104 step [ 532], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0111], 1.94 sec/batch.
2022-10-20 21:04:36,235 step [ 533], lr [0.0001500], embedding loss [ 0.8278], quantization loss [ 0.0118], 1.92 sec/batch.
2022-10-20 21:04:39,267 step [ 534], lr [0.0001500], embedding loss [ 0.8260], quantization loss [ 0.0108], 1.81 sec/batch.
2022-10-20 21:04:42,375 step [ 535], lr [0.0001500], embedding loss [ 0.8166], quantization loss [ 0.0121], 1.90 sec/batch.
2022-10-20 21:04:45,542 step [ 536], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0112], 1.92 sec/batch.
2022-10-20 21:04:48,669 step [ 537], lr [0.0001500], embedding loss [ 0.8251], quantization loss [ 0.0120], 1.88 sec/batch.
2022-10-20 21:04:51,825 step [ 538], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0124], 1.93 sec/batch.
2022-10-20 21:04:54,903 step [ 539], lr [0.0001500], embedding loss [ 0.8250], quantization loss [ 0.0107], 1.85 sec/batch.
2022-10-20 21:04:58,064 step [ 540], lr [0.0001500], embedding loss [ 0.8150], quantization loss [ 0.0125], 1.93 sec/batch.
2022-10-20 21:05:01,212 step [ 541], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0124], 1.91 sec/batch.
2022-10-20 21:05:04,399 step [ 542], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0119], 1.95 sec/batch.
2022-10-20 21:05:07,474 step [ 543], lr [0.0001500], embedding loss [ 0.8149], quantization loss [ 0.0120], 1.82 sec/batch.
2022-10-20 21:05:10,674 step [ 544], lr [0.0001500], embedding loss [ 0.8234], quantization loss [ 0.0120], 1.91 sec/batch.
2022-10-20 21:05:13,821 step [ 545], lr [0.0001500], embedding loss [ 0.8143], quantization loss [ 0.0118], 1.93 sec/batch.
2022-10-20 21:05:16,947 step [ 546], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0111], 1.91 sec/batch.
2022-10-20 21:05:19,988 step [ 547], lr [0.0001500], embedding loss [ 0.8320], quantization loss [ 0.0114], 1.84 sec/batch.
2022-10-20 21:05:23,090 step [ 548], lr [0.0001500], embedding loss [ 0.8265], quantization loss [ 0.0109], 1.90 sec/batch.
2022-10-20 21:05:26,137 step [ 549], lr [0.0001500], embedding loss [ 0.8212], quantization loss [ 0.0121], 1.84 sec/batch.
2022-10-20 21:05:29,219 step [ 550], lr [0.0001500], embedding loss [ 0.8143], quantization loss [ 0.0120], 1.89 sec/batch.
2022-10-20 21:05:32,381 step [ 551], lr [0.0001500], embedding loss [ 0.8265], quantization loss [ 0.0121], 1.93 sec/batch.
2022-10-20 21:05:35,500 step [ 552], lr [0.0001500], embedding loss [ 0.8190], quantization loss [ 0.0108], 1.88 sec/batch.
2022-10-20 21:05:38,625 step [ 553], lr [0.0001500], embedding loss [ 0.8263], quantization loss [ 0.0117], 1.90 sec/batch.
2022-10-20 21:05:41,727 step [ 554], lr [0.0001500], embedding loss [ 0.8225], quantization loss [ 0.0115], 1.87 sec/batch.
2022-10-20 21:05:44,895 step [ 555], lr [0.0001500], embedding loss [ 0.8226], quantization loss [ 0.0120], 1.92 sec/batch.
2022-10-20 21:05:48,032 step [ 556], lr [0.0001500], embedding loss [ 0.8106], quantization loss [ 0.0114], 1.90 sec/batch.
2022-10-20 21:05:51,187 step [ 557], lr [0.0001500], embedding loss [ 0.8272], quantization loss [ 0.0130], 1.93 sec/batch.
2022-10-20 21:05:54,360 step [ 558], lr [0.0001500], embedding loss [ 0.8217], quantization loss [ 0.0116], 1.92 sec/batch.
2022-10-20 21:05:57,499 step [ 559], lr [0.0001500], embedding loss [ 0.8236], quantization loss [ 0.0114], 1.91 sec/batch.
2022-10-20 21:05:59,874 step [ 560], lr [0.0001500], embedding loss [ 0.8138], quantization loss [ 0.0133], 1.12 sec/batch.
2022-10-20 21:06:02,996 step [ 561], lr [0.0001500], embedding loss [ 0.8230], quantization loss [ 0.0126], 1.87 sec/batch.
2022-10-20 21:06:06,163 step [ 562], lr [0.0001500], embedding loss [ 0.8264], quantization loss [ 0.0115], 1.92 sec/batch.
2022-10-20 21:06:09,356 step [ 563], lr [0.0001500], embedding loss [ 0.8190], quantization loss [ 0.0102], 1.94 sec/batch.
2022-10-20 21:06:12,534 step [ 564], lr [0.0001500], embedding loss [ 0.8220], quantization loss [ 0.0105], 1.92 sec/batch.
2022-10-20 21:06:15,685 step [ 565], lr [0.0001500], embedding loss [ 0.8200], quantization loss [ 0.0119], 1.81 sec/batch.
2022-10-20 21:06:18,862 step [ 566], lr [0.0001500], embedding loss [ 0.8224], quantization loss [ 0.0116], 1.92 sec/batch.
2022-10-20 21:06:22,078 step [ 567], lr [0.0001500], embedding loss [ 0.8251], quantization loss [ 0.0118], 1.94 sec/batch.
2022-10-20 21:06:25,280 step [ 568], lr [0.0001500], embedding loss [ 0.8182], quantization loss [ 0.0114], 1.97 sec/batch.
2022-10-20 21:06:28,473 step [ 569], lr [0.0001500], embedding loss [ 0.8270], quantization loss [ 0.0107], 1.92 sec/batch.
2022-10-20 21:06:31,585 step [ 570], lr [0.0001500], embedding loss [ 0.8226], quantization loss [ 0.0125], 1.86 sec/batch.
2022-10-20 21:06:34,902 step [ 571], lr [0.0001500], embedding loss [ 0.8161], quantization loss [ 0.0113], 2.02 sec/batch.
2022-10-20 21:06:38,129 step [ 572], lr [0.0001500], embedding loss [ 0.8191], quantization loss [ 0.0116], 1.99 sec/batch.
2022-10-20 21:06:39,913 step [ 573], lr [0.0001500], embedding loss [ 0.8151], quantization loss [ 0.0118], 0.58 sec/batch.
2022-10-20 21:06:43,030 step [ 574], lr [0.0001500], embedding loss [ 0.8193], quantization loss [ 0.0119], 1.90 sec/batch.
2022-10-20 21:06:46,280 step [ 575], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0121], 2.04 sec/batch.
2022-10-20 21:06:49,560 step [ 576], lr [0.0001500], embedding loss [ 0.8234], quantization loss [ 0.0134], 2.08 sec/batch.
2022-10-20 21:06:52,871 step [ 577], lr [0.0001500], embedding loss [ 0.8231], quantization loss [ 0.0126], 2.08 sec/batch.
2022-10-20 21:06:56,190 step [ 578], lr [0.0001500], embedding loss [ 0.8145], quantization loss [ 0.0110], 2.06 sec/batch.
2022-10-20 21:06:59,351 step [ 579], lr [0.0001500], embedding loss [ 0.8154], quantization loss [ 0.0119], 1.95 sec/batch.
2022-10-20 21:07:02,599 step [ 580], lr [0.0001500], embedding loss [ 0.8321], quantization loss [ 0.0119], 2.04 sec/batch.
2022-10-20 21:07:05,902 step [ 581], lr [0.0001500], embedding loss [ 0.8115], quantization loss [ 0.0119], 2.08 sec/batch.
2022-10-20 21:07:09,230 step [ 582], lr [0.0001500], embedding loss [ 0.8207], quantization loss [ 0.0110], 2.07 sec/batch.
2022-10-20 21:07:12,491 step [ 583], lr [0.0001500], embedding loss [ 0.8287], quantization loss [ 0.0114], 2.02 sec/batch.
2022-10-20 21:07:15,656 step [ 584], lr [0.0001500], embedding loss [ 0.8217], quantization loss [ 0.0121], 1.94 sec/batch.
2022-10-20 21:07:18,895 step [ 585], lr [0.0001500], embedding loss [ 0.8168], quantization loss [ 0.0113], 2.04 sec/batch.
2022-10-20 21:07:18,895 update codes and centers iter(1/1).
2022-10-20 21:07:24,599 number of update_code wrong: 0.
2022-10-20 21:07:27,172 non zero codewords: 768.
2022-10-20 21:07:27,172 finish center update, duration: 8.28 sec.
2022-10-20 21:07:30,524 step [ 586], lr [0.0001500], embedding loss [ 0.8241], quantization loss [ 0.0098], 2.16 sec/batch.
2022-10-20 21:07:33,921 step [ 587], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0103], 2.18 sec/batch.
2022-10-20 21:07:37,393 step [ 588], lr [0.0001500], embedding loss [ 0.8276], quantization loss [ 0.0110], 2.23 sec/batch.
2022-10-20 21:07:40,731 step [ 589], lr [0.0001500], embedding loss [ 0.8182], quantization loss [ 0.0113], 2.10 sec/batch.
2022-10-20 21:07:44,085 step [ 590], lr [0.0001500], embedding loss [ 0.8250], quantization loss [ 0.0120], 2.11 sec/batch.
2022-10-20 21:07:47,513 step [ 591], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0106], 2.19 sec/batch.
2022-10-20 21:07:50,989 step [ 592], lr [0.0001500], embedding loss [ 0.8244], quantization loss [ 0.0097], 2.22 sec/batch.
2022-10-20 21:07:54,470 step [ 593], lr [0.0001500], embedding loss [ 0.8232], quantization loss [ 0.0099], 2.26 sec/batch.
2022-10-20 21:07:57,935 step [ 594], lr [0.0001500], embedding loss [ 0.8207], quantization loss [ 0.0098], 2.25 sec/batch.
2022-10-20 21:08:01,451 step [ 595], lr [0.0001500], embedding loss [ 0.8202], quantization loss [ 0.0107], 2.26 sec/batch.
2022-10-20 21:08:04,945 step [ 596], lr [0.0001500], embedding loss [ 0.8249], quantization loss [ 0.0101], 2.26 sec/batch.
2022-10-20 21:08:08,483 step [ 597], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0104], 2.27 sec/batch.
2022-10-20 21:08:11,879 step [ 598], lr [0.0001500], embedding loss [ 0.8142], quantization loss [ 0.0102], 2.12 sec/batch.
2022-10-20 21:08:15,679 step [ 599], lr [0.0001500], embedding loss [ 0.8144], quantization loss [ 0.0097], 2.58 sec/batch.
2022-10-20 21:08:19,190 step [ 600], lr [0.0001500], embedding loss [ 0.8310], quantization loss [ 0.0105], 2.28 sec/batch.
2022-10-20 21:08:22,737 step [ 601], lr [0.0000750], embedding loss [ 0.8296], quantization loss [ 0.0104], 2.28 sec/batch.
2022-10-20 21:08:26,259 step [ 602], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0101], 2.30 sec/batch.
2022-10-20 21:08:29,805 step [ 603], lr [0.0000750], embedding loss [ 0.8244], quantization loss [ 0.0107], 2.29 sec/batch.
2022-10-20 21:08:33,349 step [ 604], lr [0.0000750], embedding loss [ 0.8186], quantization loss [ 0.0106], 2.27 sec/batch.
2022-10-20 21:08:36,823 step [ 605], lr [0.0000750], embedding loss [ 0.8158], quantization loss [ 0.0104], 2.22 sec/batch.
2022-10-20 21:08:40,323 step [ 606], lr [0.0000750], embedding loss [ 0.8284], quantization loss [ 0.0100], 2.27 sec/batch.
2022-10-20 21:08:44,084 step [ 607], lr [0.0000750], embedding loss [ 0.8261], quantization loss [ 0.0098], 2.50 sec/batch.
2022-10-20 21:08:47,558 step [ 608], lr [0.0000750], embedding loss [ 0.8160], quantization loss [ 0.0098], 2.20 sec/batch.
2022-10-20 21:08:50,994 step [ 609], lr [0.0000750], embedding loss [ 0.8167], quantization loss [ 0.0099], 2.19 sec/batch.
2022-10-20 21:08:54,509 step [ 610], lr [0.0000750], embedding loss [ 0.8193], quantization loss [ 0.0094], 2.25 sec/batch.
2022-10-20 21:08:58,057 step [ 611], lr [0.0000750], embedding loss [ 0.8214], quantization loss [ 0.0099], 2.21 sec/batch.
2022-10-20 21:09:01,616 step [ 612], lr [0.0000750], embedding loss [ 0.8205], quantization loss [ 0.0101], 2.23 sec/batch.
2022-10-20 21:09:05,127 step [ 613], lr [0.0000750], embedding loss [ 0.8222], quantization loss [ 0.0095], 2.21 sec/batch.
2022-10-20 21:09:08,440 step [ 614], lr [0.0000750], embedding loss [ 0.8177], quantization loss [ 0.0109], 2.05 sec/batch.
2022-10-20 21:09:11,900 step [ 615], lr [0.0000750], embedding loss [ 0.8141], quantization loss [ 0.0096], 2.17 sec/batch.
2022-10-20 21:09:15,761 step [ 616], lr [0.0000750], embedding loss [ 0.8172], quantization loss [ 0.0107], 2.61 sec/batch.
2022-10-20 21:09:19,268 step [ 617], lr [0.0000750], embedding loss [ 0.8201], quantization loss [ 0.0108], 2.17 sec/batch.
2022-10-20 21:09:22,686 step [ 618], lr [0.0000750], embedding loss [ 0.8319], quantization loss [ 0.0109], 2.14 sec/batch.
2022-10-20 21:09:26,101 step [ 619], lr [0.0000750], embedding loss [ 0.8209], quantization loss [ 0.0097], 2.18 sec/batch.
2022-10-20 21:09:29,719 step [ 620], lr [0.0000750], embedding loss [ 0.8159], quantization loss [ 0.0104], 2.21 sec/batch.
2022-10-20 21:09:33,207 step [ 621], lr [0.0000750], embedding loss [ 0.8166], quantization loss [ 0.0101], 2.22 sec/batch.
2022-10-20 21:09:36,692 step [ 622], lr [0.0000750], embedding loss [ 0.8144], quantization loss [ 0.0095], 2.21 sec/batch.
2022-10-20 21:09:40,083 step [ 623], lr [0.0000750], embedding loss [ 0.8168], quantization loss [ 0.0101], 2.13 sec/batch.
2022-10-20 21:09:43,535 step [ 624], lr [0.0000750], embedding loss [ 0.8289], quantization loss [ 0.0091], 2.17 sec/batch.
2022-10-20 21:09:47,319 step [ 625], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0100], 2.54 sec/batch.
2022-10-20 21:09:50,886 step [ 626], lr [0.0000750], embedding loss [ 0.8225], quantization loss [ 0.0096], 2.30 sec/batch.
2022-10-20 21:09:54,387 step [ 627], lr [0.0000750], embedding loss [ 0.8294], quantization loss [ 0.0099], 2.22 sec/batch.
2022-10-20 21:09:57,966 step [ 628], lr [0.0000750], embedding loss [ 0.8214], quantization loss [ 0.0101], 2.33 sec/batch.
2022-10-20 21:10:01,533 step [ 629], lr [0.0000750], embedding loss [ 0.8170], quantization loss [ 0.0102], 2.34 sec/batch.
2022-10-20 21:10:05,114 step [ 630], lr [0.0000750], embedding loss [ 0.8215], quantization loss [ 0.0095], 2.35 sec/batch.
2022-10-20 21:10:08,677 step [ 631], lr [0.0000750], embedding loss [ 0.8264], quantization loss [ 0.0098], 2.33 sec/batch.
2022-10-20 21:10:12,245 step [ 632], lr [0.0000750], embedding loss [ 0.8168], quantization loss [ 0.0097], 2.33 sec/batch.
2022-10-20 21:10:15,818 step [ 633], lr [0.0000750], embedding loss [ 0.8152], quantization loss [ 0.0104], 2.33 sec/batch.
2022-10-20 21:10:19,368 step [ 634], lr [0.0000750], embedding loss [ 0.8289], quantization loss [ 0.0104], 2.31 sec/batch.
2022-10-20 21:10:22,934 step [ 635], lr [0.0000750], embedding loss [ 0.8179], quantization loss [ 0.0095], 2.33 sec/batch.
2022-10-20 21:10:26,525 step [ 636], lr [0.0000750], embedding loss [ 0.8102], quantization loss [ 0.0105], 2.34 sec/batch.
2022-10-20 21:10:30,076 step [ 637], lr [0.0000750], embedding loss [ 0.8212], quantization loss [ 0.0098], 2.31 sec/batch.
2022-10-20 21:10:33,724 step [ 638], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0099], 2.33 sec/batch.
2022-10-20 21:10:37,277 step [ 639], lr [0.0000750], embedding loss [ 0.8187], quantization loss [ 0.0102], 2.30 sec/batch.
2022-10-20 21:10:40,809 step [ 640], lr [0.0000750], embedding loss [ 0.8242], quantization loss [ 0.0101], 2.29 sec/batch.
2022-10-20 21:10:44,393 step [ 641], lr [0.0000750], embedding loss [ 0.8156], quantization loss [ 0.0098], 2.32 sec/batch.
2022-10-20 21:10:47,951 step [ 642], lr [0.0000750], embedding loss [ 0.8221], quantization loss [ 0.0094], 2.24 sec/batch.
2022-10-20 21:10:51,484 step [ 643], lr [0.0000750], embedding loss [ 0.8194], quantization loss [ 0.0114], 2.20 sec/batch.
2022-10-20 21:10:55,127 step [ 644], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0092], 2.33 sec/batch.
2022-10-20 21:10:58,691 step [ 645], lr [0.0000750], embedding loss [ 0.8304], quantization loss [ 0.0109], 2.33 sec/batch.
2022-10-20 21:11:02,264 step [ 646], lr [0.0000750], embedding loss [ 0.8220], quantization loss [ 0.0103], 2.34 sec/batch.
2022-10-20 21:11:05,896 step [ 647], lr [0.0000750], embedding loss [ 0.8256], quantization loss [ 0.0109], 2.35 sec/batch.
2022-10-20 21:11:09,449 step [ 648], lr [0.0000750], embedding loss [ 0.8203], quantization loss [ 0.0107], 2.32 sec/batch.
2022-10-20 21:11:13,002 step [ 649], lr [0.0000750], embedding loss [ 0.8206], quantization loss [ 0.0096], 2.34 sec/batch.
2022-10-20 21:11:16,478 step [ 650], lr [0.0000750], embedding loss [ 0.8229], quantization loss [ 0.0112], 2.26 sec/batch.
2022-10-20 21:11:20,050 step [ 651], lr [0.0000750], embedding loss [ 0.8241], quantization loss [ 0.0098], 2.34 sec/batch.
2022-10-20 21:11:23,589 step [ 652], lr [0.0000750], embedding loss [ 0.8197], quantization loss [ 0.0108], 2.30 sec/batch.
2022-10-20 21:11:27,077 step [ 653], lr [0.0000750], embedding loss [ 0.8164], quantization loss [ 0.0102], 2.24 sec/batch.
2022-10-20 21:11:30,653 step [ 654], lr [0.0000750], embedding loss [ 0.8273], quantization loss [ 0.0095], 2.32 sec/batch.
2022-10-20 21:11:34,221 step [ 655], lr [0.0000750], embedding loss [ 0.8232], quantization loss [ 0.0105], 2.33 sec/batch.
2022-10-20 21:11:37,852 step [ 656], lr [0.0000750], embedding loss [ 0.8175], quantization loss [ 0.0102], 2.36 sec/batch.
2022-10-20 21:11:41,465 step [ 657], lr [0.0000750], embedding loss [ 0.8201], quantization loss [ 0.0096], 2.33 sec/batch.
2022-10-20 21:11:45,094 step [ 658], lr [0.0000750], embedding loss [ 0.8269], quantization loss [ 0.0101], 2.36 sec/batch.
2022-10-20 21:11:48,624 step [ 659], lr [0.0000750], embedding loss [ 0.8246], quantization loss [ 0.0099], 2.24 sec/batch.
2022-10-20 21:11:52,209 step [ 660], lr [0.0000750], embedding loss [ 0.8179], quantization loss [ 0.0096], 2.34 sec/batch.
2022-10-20 21:11:55,813 step [ 661], lr [0.0000750], embedding loss [ 0.8224], quantization loss [ 0.0095], 2.37 sec/batch.
2022-10-20 21:11:59,392 step [ 662], lr [0.0000750], embedding loss [ 0.8306], quantization loss [ 0.0099], 2.34 sec/batch.
2022-10-20 21:12:02,827 step [ 663], lr [0.0000750], embedding loss [ 0.8164], quantization loss [ 0.0104], 2.19 sec/batch.
2022-10-20 21:12:06,399 step [ 664], lr [0.0000750], embedding loss [ 0.8258], quantization loss [ 0.0099], 2.33 sec/batch.
2022-10-20 21:12:09,874 step [ 665], lr [0.0000750], embedding loss [ 0.8240], quantization loss [ 0.0098], 2.27 sec/batch.
2022-10-20 21:12:13,441 step [ 666], lr [0.0000750], embedding loss [ 0.8145], quantization loss [ 0.0098], 2.34 sec/batch.
2022-10-20 21:12:16,941 step [ 667], lr [0.0000750], embedding loss [ 0.8222], quantization loss [ 0.0088], 2.27 sec/batch.
2022-10-20 21:12:20,457 step [ 668], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0090], 2.28 sec/batch.
2022-10-20 21:12:24,031 step [ 669], lr [0.0000750], embedding loss [ 0.8263], quantization loss [ 0.0098], 2.31 sec/batch.
2022-10-20 21:12:27,487 step [ 670], lr [0.0000750], embedding loss [ 0.8225], quantization loss [ 0.0097], 2.20 sec/batch.
2022-10-20 21:12:31,036 step [ 671], lr [0.0000750], embedding loss [ 0.8198], quantization loss [ 0.0098], 2.32 sec/batch.
2022-10-20 21:12:34,652 step [ 672], lr [0.0000750], embedding loss [ 0.8242], quantization loss [ 0.0096], 2.35 sec/batch.
2022-10-20 21:12:38,342 step [ 673], lr [0.0000750], embedding loss [ 0.8220], quantization loss [ 0.0100], 2.37 sec/batch.
2022-10-20 21:12:41,984 step [ 674], lr [0.0000750], embedding loss [ 0.8113], quantization loss [ 0.0097], 2.35 sec/batch.
2022-10-20 21:12:45,551 step [ 675], lr [0.0000750], embedding loss [ 0.8250], quantization loss [ 0.0093], 2.34 sec/batch.
2022-10-20 21:12:49,138 step [ 676], lr [0.0000750], embedding loss [ 0.8095], quantization loss [ 0.0098], 2.35 sec/batch.
2022-10-20 21:12:52,654 step [ 677], lr [0.0000750], embedding loss [ 0.8267], quantization loss [ 0.0091], 2.28 sec/batch.
2022-10-20 21:12:56,195 step [ 678], lr [0.0000750], embedding loss [ 0.8204], quantization loss [ 0.0106], 2.32 sec/batch.
2022-10-20 21:12:59,777 step [ 679], lr [0.0000750], embedding loss [ 0.8348], quantization loss [ 0.0100], 2.34 sec/batch.
2022-10-20 21:13:03,305 step [ 680], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0101], 2.29 sec/batch.
2022-10-20 21:13:06,929 step [ 681], lr [0.0000750], embedding loss [ 0.8171], quantization loss [ 0.0100], 2.35 sec/batch.
2022-10-20 21:13:10,489 step [ 682], lr [0.0000750], embedding loss [ 0.8206], quantization loss [ 0.0102], 2.31 sec/batch.
2022-10-20 21:13:13,998 step [ 683], lr [0.0000750], embedding loss [ 0.8144], quantization loss [ 0.0099], 2.27 sec/batch.
2022-10-20 21:13:17,610 step [ 684], lr [0.0000750], embedding loss [ 0.8213], quantization loss [ 0.0099], 2.33 sec/batch.
2022-10-20 21:13:21,199 step [ 685], lr [0.0000750], embedding loss [ 0.8200], quantization loss [ 0.0094], 2.34 sec/batch.
2022-10-20 21:13:24,606 step [ 686], lr [0.0000750], embedding loss [ 0.8259], quantization loss [ 0.0103], 2.18 sec/batch.
2022-10-20 21:13:28,105 step [ 687], lr [0.0000750], embedding loss [ 0.8240], quantization loss [ 0.0087], 2.26 sec/batch.
2022-10-20 21:13:31,674 step [ 688], lr [0.0000750], embedding loss [ 0.8220], quantization loss [ 0.0092], 2.30 sec/batch.
2022-10-20 21:13:35,325 step [ 689], lr [0.0000750], embedding loss [ 0.8294], quantization loss [ 0.0109], 2.34 sec/batch.
2022-10-20 21:13:38,917 step [ 690], lr [0.0000750], embedding loss [ 0.8214], quantization loss [ 0.0104], 2.33 sec/batch.
2022-10-20 21:13:42,522 step [ 691], lr [0.0000750], embedding loss [ 0.8189], quantization loss [ 0.0099], 2.33 sec/batch.
2022-10-20 21:13:46,058 step [ 692], lr [0.0000750], embedding loss [ 0.8204], quantization loss [ 0.0091], 2.31 sec/batch.
2022-10-20 21:13:49,597 step [ 693], lr [0.0000750], embedding loss [ 0.8157], quantization loss [ 0.0092], 2.32 sec/batch.
2022-10-20 21:13:53,189 step [ 694], lr [0.0000750], embedding loss [ 0.8219], quantization loss [ 0.0100], 2.35 sec/batch.
2022-10-20 21:13:56,766 step [ 695], lr [0.0000750], embedding loss [ 0.8156], quantization loss [ 0.0097], 2.35 sec/batch.
2022-10-20 21:14:00,352 step [ 696], lr [0.0000750], embedding loss [ 0.8188], quantization loss [ 0.0102], 2.33 sec/batch.
2022-10-20 21:14:03,873 step [ 697], lr [0.0000750], embedding loss [ 0.8216], quantization loss [ 0.0103], 2.27 sec/batch.
2022-10-20 21:14:07,396 step [ 698], lr [0.0000750], embedding loss [ 0.8185], quantization loss [ 0.0100], 2.28 sec/batch.
2022-10-20 21:14:10,988 step [ 699], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0099], 2.33 sec/batch.
2022-10-20 21:14:14,523 step [ 700], lr [0.0000750], embedding loss [ 0.8222], quantization loss [ 0.0096], 2.29 sec/batch.
2022-10-20 21:14:18,056 step [ 701], lr [0.0000750], embedding loss [ 0.8218], quantization loss [ 0.0094], 2.29 sec/batch.
2022-10-20 21:14:21,670 step [ 702], lr [0.0000750], embedding loss [ 0.8237], quantization loss [ 0.0088], 2.35 sec/batch.
2022-10-20 21:14:25,354 step [ 703], lr [0.0000750], embedding loss [ 0.8269], quantization loss [ 0.0107], 2.40 sec/batch.
2022-10-20 21:14:29,707 step [ 704], lr [0.0000750], embedding loss [ 0.8209], quantization loss [ 0.0088], 3.07 sec/batch.
2022-10-20 21:14:32,649 step [ 705], lr [0.0000750], embedding loss [ 0.8236], quantization loss [ 0.0108], 1.71 sec/batch.
2022-10-20 21:14:35,900 step [ 706], lr [0.0000750], embedding loss [ 0.8224], quantization loss [ 0.0092], 2.03 sec/batch.
2022-10-20 21:14:39,290 step [ 707], lr [0.0000750], embedding loss [ 0.8143], quantization loss [ 0.0103], 2.15 sec/batch.
2022-10-20 21:14:42,571 step [ 708], lr [0.0000750], embedding loss [ 0.8217], quantization loss [ 0.0100], 1.77 sec/batch.
2022-10-20 21:14:45,983 step [ 709], lr [0.0000750], embedding loss [ 0.8197], quantization loss [ 0.0094], 2.17 sec/batch.
2022-10-20 21:14:49,461 step [ 710], lr [0.0000750], embedding loss [ 0.8187], quantization loss [ 0.0108], 2.23 sec/batch.
2022-10-20 21:14:52,785 step [ 711], lr [0.0000750], embedding loss [ 0.8217], quantization loss [ 0.0102], 2.10 sec/batch.
2022-10-20 21:14:56,085 step [ 712], lr [0.0000750], embedding loss [ 0.8260], quantization loss [ 0.0105], 2.05 sec/batch.
2022-10-20 21:14:59,481 step [ 713], lr [0.0000750], embedding loss [ 0.8217], quantization loss [ 0.0102], 2.15 sec/batch.
2022-10-20 21:15:02,864 step [ 714], lr [0.0000750], embedding loss [ 0.8110], quantization loss [ 0.0099], 2.13 sec/batch.
2022-10-20 21:15:06,267 step [ 715], lr [0.0000750], embedding loss [ 0.8254], quantization loss [ 0.0099], 2.15 sec/batch.
2022-10-20 21:15:09,743 step [ 716], lr [0.0000750], embedding loss [ 0.8142], quantization loss [ 0.0099], 2.17 sec/batch.
2022-10-20 21:15:13,236 step [ 717], lr [0.0000750], embedding loss [ 0.8241], quantization loss [ 0.0101], 2.24 sec/batch.
2022-10-20 21:15:16,518 step [ 718], lr [0.0000750], embedding loss [ 0.8218], quantization loss [ 0.0098], 2.05 sec/batch.
2022-10-20 21:15:19,847 step [ 719], lr [0.0000750], embedding loss [ 0.8202], quantization loss [ 0.0096], 2.05 sec/batch.
2022-10-20 21:15:23,208 step [ 720], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0107], 2.09 sec/batch.
2022-10-20 21:15:26,587 step [ 721], lr [0.0000750], embedding loss [ 0.8270], quantization loss [ 0.0099], 2.13 sec/batch.
2022-10-20 21:15:29,991 step [ 722], lr [0.0000750], embedding loss [ 0.8186], quantization loss [ 0.0101], 2.15 sec/batch.
2022-10-20 21:15:33,445 step [ 723], lr [0.0000750], embedding loss [ 0.8099], quantization loss [ 0.0100], 2.17 sec/batch.
2022-10-20 21:15:36,877 step [ 724], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0098], 2.18 sec/batch.
2022-10-20 21:15:40,373 step [ 725], lr [0.0000750], embedding loss [ 0.8205], quantization loss [ 0.0099], 2.24 sec/batch.
2022-10-20 21:15:43,825 step [ 726], lr [0.0000750], embedding loss [ 0.8212], quantization loss [ 0.0099], 2.20 sec/batch.
2022-10-20 21:15:47,157 step [ 727], lr [0.0000750], embedding loss [ 0.8213], quantization loss [ 0.0108], 2.08 sec/batch.
2022-10-20 21:15:50,585 step [ 728], lr [0.0000750], embedding loss [ 0.8130], quantization loss [ 0.0097], 2.17 sec/batch.
2022-10-20 21:15:53,986 step [ 729], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0089], 2.15 sec/batch.
2022-10-20 21:15:57,447 step [ 730], lr [0.0000750], embedding loss [ 0.8180], quantization loss [ 0.0098], 2.20 sec/batch.
2022-10-20 21:16:00,840 step [ 731], lr [0.0000750], embedding loss [ 0.8239], quantization loss [ 0.0109], 2.14 sec/batch.
2022-10-20 21:16:00,840 update codes and centers iter(1/1).
2022-10-20 21:16:06,625 number of update_code wrong: 0.
2022-10-20 21:16:09,219 non zero codewords: 768.
2022-10-20 21:16:09,219 finish center update, duration: 8.38 sec.
2022-10-20 21:16:12,432 step [ 732], lr [0.0000750], embedding loss [ 0.8228], quantization loss [ 0.0090], 2.05 sec/batch.
2022-10-20 21:16:16,482 step [ 733], lr [0.0000750], embedding loss [ 0.8170], quantization loss [ 0.0103], 2.80 sec/batch.
2022-10-20 21:16:19,983 step [ 734], lr [0.0000750], embedding loss [ 0.8189], quantization loss [ 0.0085], 2.20 sec/batch.
2022-10-20 21:16:23,435 step [ 735], lr [0.0000750], embedding loss [ 0.8156], quantization loss [ 0.0095], 2.14 sec/batch.
2022-10-20 21:16:26,754 step [ 736], lr [0.0000750], embedding loss [ 0.8188], quantization loss [ 0.0098], 2.12 sec/batch.
2022-10-20 21:16:30,164 step [ 737], lr [0.0000750], embedding loss [ 0.8177], quantization loss [ 0.0098], 2.17 sec/batch.
2022-10-20 21:16:33,573 step [ 738], lr [0.0000750], embedding loss [ 0.8270], quantization loss [ 0.0092], 2.15 sec/batch.
2022-10-20 21:16:37,014 step [ 739], lr [0.0000750], embedding loss [ 0.8230], quantization loss [ 0.0087], 2.18 sec/batch.
2022-10-20 21:16:40,294 step [ 740], lr [0.0000750], embedding loss [ 0.8173], quantization loss [ 0.0082], 2.04 sec/batch.
2022-10-20 21:16:43,575 step [ 741], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0093], 2.09 sec/batch.
2022-10-20 21:16:46,916 step [ 742], lr [0.0000750], embedding loss [ 0.8221], quantization loss [ 0.0105], 2.10 sec/batch.
2022-10-20 21:16:50,361 step [ 743], lr [0.0000750], embedding loss [ 0.8157], quantization loss [ 0.0088], 2.18 sec/batch.
2022-10-20 21:16:53,719 step [ 744], lr [0.0000750], embedding loss [ 0.8196], quantization loss [ 0.0088], 2.14 sec/batch.
2022-10-20 21:16:57,109 step [ 745], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0088], 2.15 sec/batch.
2022-10-20 21:17:00,514 step [ 746], lr [0.0000750], embedding loss [ 0.8188], quantization loss [ 0.0083], 2.14 sec/batch.
2022-10-20 21:17:04,187 step [ 747], lr [0.0000750], embedding loss [ 0.8235], quantization loss [ 0.0096], 2.38 sec/batch.
2022-10-20 21:17:07,517 step [ 748], lr [0.0000750], embedding loss [ 0.8222], quantization loss [ 0.0088], 2.06 sec/batch.
2022-10-20 21:17:10,964 step [ 749], lr [0.0000750], embedding loss [ 0.8232], quantization loss [ 0.0086], 2.18 sec/batch.
2022-10-20 21:17:14,441 step [ 750], lr [0.0000750], embedding loss [ 0.8257], quantization loss [ 0.0088], 2.18 sec/batch.
2022-10-20 21:17:17,752 step [ 751], lr [0.0000750], embedding loss [ 0.8225], quantization loss [ 0.0095], 2.04 sec/batch.
2022-10-20 21:17:21,222 step [ 752], lr [0.0000750], embedding loss [ 0.8208], quantization loss [ 0.0088], 2.20 sec/batch.
2022-10-20 21:17:24,699 step [ 753], lr [0.0000750], embedding loss [ 0.8217], quantization loss [ 0.0094], 2.22 sec/batch.
2022-10-20 21:17:28,196 step [ 754], lr [0.0000750], embedding loss [ 0.8191], quantization loss [ 0.0096], 2.25 sec/batch.
2022-10-20 21:17:31,657 step [ 755], lr [0.0000750], embedding loss [ 0.8162], quantization loss [ 0.0095], 2.20 sec/batch.
2022-10-20 21:17:35,006 step [ 756], lr [0.0000750], embedding loss [ 0.8215], quantization loss [ 0.0092], 2.08 sec/batch.
2022-10-20 21:17:38,851 step [ 757], lr [0.0000750], embedding loss [ 0.8166], quantization loss [ 0.0094], 2.52 sec/batch.
2022-10-20 21:17:42,269 step [ 758], lr [0.0000750], embedding loss [ 0.8231], quantization loss [ 0.0093], 2.15 sec/batch.
2022-10-20 21:17:45,747 step [ 759], lr [0.0000750], embedding loss [ 0.8107], quantization loss [ 0.0092], 2.21 sec/batch.
2022-10-20 21:17:49,221 step [ 760], lr [0.0000750], embedding loss [ 0.8249], quantization loss [ 0.0088], 2.23 sec/batch.
2022-10-20 21:17:52,651 step [ 761], lr [0.0000750], embedding loss [ 0.8212], quantization loss [ 0.0094], 2.18 sec/batch.
2022-10-20 21:17:56,160 step [ 762], lr [0.0000750], embedding loss [ 0.8218], quantization loss [ 0.0092], 2.25 sec/batch.
2022-10-20 21:17:59,469 step [ 763], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0091], 2.03 sec/batch.
2022-10-20 21:18:02,848 step [ 764], lr [0.0000750], embedding loss [ 0.8173], quantization loss [ 0.0091], 2.12 sec/batch.
2022-10-20 21:18:06,241 step [ 765], lr [0.0000750], embedding loss [ 0.8206], quantization loss [ 0.0096], 2.13 sec/batch.
2022-10-20 21:18:09,665 step [ 766], lr [0.0000750], embedding loss [ 0.8169], quantization loss [ 0.0094], 2.14 sec/batch.
2022-10-20 21:18:13,184 step [ 767], lr [0.0000750], embedding loss [ 0.8315], quantization loss [ 0.0088], 2.22 sec/batch.
2022-10-20 21:18:16,486 step [ 768], lr [0.0000750], embedding loss [ 0.8168], quantization loss [ 0.0093], 2.04 sec/batch.
2022-10-20 21:18:19,875 step [ 769], lr [0.0000750], embedding loss [ 0.8218], quantization loss [ 0.0093], 2.14 sec/batch.
2022-10-20 21:18:23,328 step [ 770], lr [0.0000750], embedding loss [ 0.8130], quantization loss [ 0.0086], 2.19 sec/batch.
2022-10-20 21:18:26,701 step [ 771], lr [0.0000750], embedding loss [ 0.8281], quantization loss [ 0.0099], 2.14 sec/batch.
2022-10-20 21:18:30,148 step [ 772], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0093], 2.18 sec/batch.
2022-10-20 21:18:33,409 step [ 773], lr [0.0000750], embedding loss [ 0.8197], quantization loss [ 0.0082], 2.05 sec/batch.
2022-10-20 21:18:36,789 step [ 774], lr [0.0000750], embedding loss [ 0.8155], quantization loss [ 0.0090], 2.15 sec/batch.
2022-10-20 21:18:40,868 step [ 775], lr [0.0000750], embedding loss [ 0.8211], quantization loss [ 0.0085], 2.54 sec/batch.
2022-10-20 21:18:44,292 step [ 776], lr [0.0000750], embedding loss [ 0.8216], quantization loss [ 0.0086], 2.16 sec/batch.
2022-10-20 21:18:47,602 step [ 777], lr [0.0000750], embedding loss [ 0.8217], quantization loss [ 0.0094], 2.06 sec/batch.
2022-10-20 21:18:51,670 step [ 778], lr [0.0000750], embedding loss [ 0.8201], quantization loss [ 0.0090], 2.82 sec/batch.
2022-10-20 21:18:55,165 step [ 779], lr [0.0000750], embedding loss [ 0.8228], quantization loss [ 0.0093], 2.21 sec/batch.
2022-10-20 21:18:58,523 step [ 780], lr [0.0000750], embedding loss [ 0.8252], quantization loss [ 0.0088], 2.10 sec/batch.
2022-10-20 21:19:01,867 step [ 781], lr [0.0000750], embedding loss [ 0.8195], quantization loss [ 0.0098], 2.09 sec/batch.
2022-10-20 21:19:05,368 step [ 782], lr [0.0000750], embedding loss [ 0.8138], quantization loss [ 0.0085], 2.21 sec/batch.
2022-10-20 21:19:08,765 step [ 783], lr [0.0000750], embedding loss [ 0.8248], quantization loss [ 0.0092], 2.15 sec/batch.
2022-10-20 21:19:12,103 step [ 784], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0086], 2.07 sec/batch.
2022-10-20 21:19:15,552 step [ 785], lr [0.0000750], embedding loss [ 0.8220], quantization loss [ 0.0091], 2.17 sec/batch.
2022-10-20 21:19:19,052 step [ 786], lr [0.0000750], embedding loss [ 0.8228], quantization loss [ 0.0091], 2.17 sec/batch.
2022-10-20 21:19:22,492 step [ 787], lr [0.0000750], embedding loss [ 0.8233], quantization loss [ 0.0080], 2.18 sec/batch.
2022-10-20 21:19:25,889 step [ 788], lr [0.0000750], embedding loss [ 0.8163], quantization loss [ 0.0086], 2.13 sec/batch.
2022-10-20 21:19:29,295 step [ 789], lr [0.0000750], embedding loss [ 0.8276], quantization loss [ 0.0089], 2.18 sec/batch.
2022-10-20 21:19:32,743 step [ 790], lr [0.0000750], embedding loss [ 0.8239], quantization loss [ 0.0093], 2.18 sec/batch.
2022-10-20 21:19:36,029 step [ 791], lr [0.0000750], embedding loss [ 0.8288], quantization loss [ 0.0086], 2.06 sec/batch.
2022-10-20 21:19:39,451 step [ 792], lr [0.0000750], embedding loss [ 0.8242], quantization loss [ 0.0087], 2.17 sec/batch.
2022-10-20 21:19:42,767 step [ 793], lr [0.0000750], embedding loss [ 0.8238], quantization loss [ 0.0084], 2.09 sec/batch.
2022-10-20 21:19:46,245 step [ 794], lr [0.0000750], embedding loss [ 0.8166], quantization loss [ 0.0090], 2.21 sec/batch.
2022-10-20 21:19:49,646 step [ 795], lr [0.0000750], embedding loss [ 0.8119], quantization loss [ 0.0093], 2.14 sec/batch.
2022-10-20 21:19:53,757 step [ 796], lr [0.0000750], embedding loss [ 0.8285], quantization loss [ 0.0086], 2.85 sec/batch.
2022-10-20 21:19:57,282 step [ 797], lr [0.0000750], embedding loss [ 0.8274], quantization loss [ 0.0087], 2.25 sec/batch.
2022-10-20 21:20:00,699 step [ 798], lr [0.0000750], embedding loss [ 0.8168], quantization loss [ 0.0087], 2.15 sec/batch.
2022-10-20 21:20:04,709 step [ 799], lr [0.0000750], embedding loss [ 0.8171], quantization loss [ 0.0104], 2.70 sec/batch.
2022-10-20 21:20:08,025 step [ 800], lr [0.0000750], embedding loss [ 0.8113], quantization loss [ 0.0092], 2.04 sec/batch.
2022-10-20 21:20:08,025 finish training iterations and begin saving model.
2022-10-20 21:20:13,816 finish model saving.
2022-10-20 21:20:13,816 finish training, model saved under ./checkpoints/flickr_WSDQH_nbits=24_adaMargin_gamma=1_lambda=0.0001_0003.npy.
2022-10-20 21:20:17,199 prepare dataset.
2022-10-20 21:20:17,848 prepare data loader.
2022-10-20 21:20:17,848 Initializing DataLoader.
2022-10-20 21:20:17,848 DataLoader already.
2022-10-20 21:20:17,849 Initializing DataLoader.
2022-10-20 21:20:17,849 DataLoader already.
2022-10-20 21:20:17,849 prepare model.
2022-10-20 21:20:18,071 Number of semantic embeddings: 1178.
2022-10-20 21:20:35,339 begin validation.
2022-10-20 21:20:57,746 finish query feature extraction, duration: 22.41 sec.
2022-10-20 21:24:29,253 finish database feature extraction, duration: 211.51 sec.
2022-10-20 21:24:29,254 compute quantization codes for query.
2022-10-20 21:24:30,753 number of update_code wrong: 0.
2022-10-20 21:24:30,753 finish query encoding, duration: 1.50 sec.
2022-10-20 21:24:30,753 compute quantization codes for database.
2022-10-20 21:24:37,786 number of update_code wrong: 0.
2022-10-20 21:24:37,786 finish database encoding, duration: 7.03 sec.
2022-10-20 21:24:37,786 save retrieval information: codes, features, reconstructions of queries and database.
2022-10-20 21:24:38,681 begin to calculate MAP@5000.
2022-10-20 21:24:38,682 begin to calculate AQD mAP@5000.
2022-10-20 21:24:42,252 AQD mAP@5000 = [0.7653], duration: 3.57 sec.
2022-10-20 21:24:42,252 begin to calculate SQD mAP@5000.
2022-10-20 21:24:45,601 SQD mAP@5000 = [0.7639], duration: 3.35 sec.
2022-10-20 21:24:45,601 begin to calculate feats mAP@5000.
2022-10-20 21:24:49,238 feats mAP@5000 = [0.7677], duration: 3.64 sec.
2022-10-20 21:24:49,239 finish validation.
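The validation phase above reports mAP@5000 three ways: with asymmetric quantized distances (AQD), symmetric quantized distances (SQD), and raw features. As a point of reference, here is a minimal sketch of how mAP@k is commonly computed for multi-label retrieval, where a database item counts as relevant if it shares at least one label with the query. The function name and signature are illustrative, not taken from this codebase; only the distance matrix changes between the AQD/SQD/feature variants.

```python
import numpy as np

def mean_average_precision(query_labels, db_labels, distances, topk=5000):
    """mAP@topk for multi-label retrieval (illustrative sketch).

    query_labels: (Q, C) binary label matrix for queries.
    db_labels:    (N, C) binary label matrix for database items.
    distances:    (Q, N) distance matrix (e.g. AQD distances); smaller = closer.
    """
    aps = []
    for q in range(distances.shape[0]):
        # rank the database by distance and keep the top-k results
        order = np.argsort(distances[q])[:topk]
        # relevant iff the retrieved item shares at least one label with the query
        rel = (db_labels[order] @ query_labels[q]) > 0
        if rel.sum() == 0:
            continue  # queries with no relevant items in top-k are skipped
        # precision at each rank where a relevant item appears (ranks are 1-indexed)
        cum_rel = np.cumsum(rel)
        precision = cum_rel[rel] / (np.nonzero(rel)[0] + 1)
        aps.append(precision.mean())
    return float(np.mean(aps))
```

For the AQD variant, `distances` would hold inner-product or Euclidean distances between continuous query features and the quantized (reconstructed) database codes; SQD quantizes both sides; the feats variant uses continuous features on both sides, which is why it gives the slightly higher 0.7677 here.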