2023-02-18 13:15:36,723 32k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 32000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'asahi': 0}, 'model_dir': './logs\\32k'}
2023-02-18 13:15:36,724 32k WARNING K:\AI\so-vits-svc-32k is not a git repository, therefore hash value comparison will be ignored.
2023-02-18 13:16:46,085 32k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 32000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'asahi': 0}, 'model_dir': './logs\\32k'}
2023-02-18 13:16:46,086 32k WARNING K:\AI\so-vits-svc-32k is not a git repository, therefore hash value comparison will be ignored.
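The two INFO dumps above are the run's full hyperparameter set (the trainer was launched twice about a minute apart, hence the repeat). A minimal sketch of sanity-checking such a dump, assuming the same values live in a configs/config.json file; the path and variable names here are illustrative, not taken from the log:

import json
from math import prod

# Hypothetical path; so-vits-svc forks usually keep this JSON under configs/.
with open("configs/config.json", encoding="utf-8") as f:
    hps = json.load(f)

# The vocoder's total upsampling factor must equal the STFT hop length,
# otherwise generated samples and spectrogram frames fall out of alignment.
assert prod(hps["model"]["upsample_rates"]) == hps["data"]["hop_length"]  # 10*8*2*2 == 320

# segment_size is in samples; divide by hop_length for frames per training slice.
frames = hps["train"]["segment_size"] // hps["data"]["hop_length"]
print(frames)  # 17920 // 320 == 56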
2023-02-18 13:16:51,625 32k INFO emb_g.weight is not in the checkpoint
2023-02-18 13:16:51,737 32k INFO Loaded checkpoint './logs\32k\G_0.pth' (iteration 1)
2023-02-18 13:16:52,797 32k INFO Loaded checkpoint './logs\32k\D_0.pth' (iteration 1)
2023-02-18 13:17:24,697 32k INFO Train Epoch: 1 [0%]
2023-02-18 13:17:24,698 32k INFO [3.3069984912872314, 2.2735390663146973, 12.671528816223145, 44.619842529296875, 10.824443817138672, 0, 0.0001]
2023-02-18 13:17:30,498 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\G_0.pth
2023-02-18 13:17:45,076 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\D_0.pth
2023-02-18 13:19:47,084 32k INFO ====> Epoch: 1
2023-02-18 13:22:06,314 32k INFO ====> Epoch: 2
2023-02-18 13:23:20,386 32k INFO Train Epoch: 3 [44%]
2023-02-18 13:23:20,387 32k INFO [2.3857059478759766, 2.734795093536377, 12.341865539550781, 25.23939323425293, 1.1707755327224731, 200, 9.99750015625e-05]
2023-02-18 13:24:25,089 32k INFO ====> Epoch: 3
2023-02-18 13:26:44,542 32k INFO ====> Epoch: 4
2023-02-18 13:28:50,668 32k INFO Train Epoch: 5 [88%]
2023-02-18 13:28:50,668 32k INFO [2.599891185760498, 2.339749813079834, 8.97850513458252, 18.280649185180664, 1.2124148607254028, 400, 9.995000937421877e-05]
2023-02-18 13:29:04,051 32k INFO ====> Epoch: 5
2023-02-18 13:31:22,495 32k INFO ====> Epoch: 6
2023-02-18 13:33:41,390 32k INFO ====> Epoch: 7
2023-02-18 13:34:46,904 32k INFO Train Epoch: 8 [32%]
2023-02-18 13:34:46,904 32k INFO [2.428616523742676, 2.5334784984588623, 7.50093936920166, 13.943485260009766, 1.0122454166412354, 600, 9.991253280566489e-05]
2023-02-18 13:36:46,611 32k INFO ====> Epoch: 8
2023-02-18 13:39:42,611 32k INFO ====> Epoch: 9
2023-02-18 13:42:01,156 32k INFO Train Epoch: 10 [76%]
2023-02-18 13:42:01,156 32k INFO [2.539777994155884, 1.9696028232574463, 10.870598793029785, 20.18303680419922, 1.0394906997680664, 800, 9.98875562335968e-05]
2023-02-18 13:42:36,519 32k INFO ====> Epoch: 10
2023-02-18 13:45:30,606 32k INFO ====> Epoch: 11
2023-02-18 13:48:27,421 32k INFO ====> Epoch: 12
2023-02-18 13:49:20,806 32k INFO Train Epoch: 13 [20%]
2023-02-18 13:49:20,806 32k INFO [2.5772485733032227, 2.3823914527893066, 10.536639213562012, 20.34115219116211, 0.9894436597824097, 1000, 9.98501030820433e-05]
2023-02-18 13:49:25,467 32k INFO Saving model and optimizer state at iteration 13 to ./logs\32k\G_1000.pth
2023-02-18 13:49:42,627 32k INFO Saving model and optimizer state at iteration 13 to ./logs\32k\D_1000.pth
2023-02-18 13:51:31,762 32k INFO ====> Epoch: 13
2023-02-18 13:53:49,535 32k INFO ====> Epoch: 14
2023-02-18 13:55:46,291 32k INFO Train Epoch: 15 [63%]
2023-02-18 13:55:46,292 32k INFO [2.4663658142089844, 2.320800542831421, 11.314515113830566, 21.392370223999023, 1.4117159843444824, 1200, 9.982514211643064e-05]
2023-02-18 13:56:38,705 32k INFO ====> Epoch: 15
2023-02-18 13:59:28,519 32k INFO ====> Epoch: 16
2023-02-18 14:02:17,766 32k INFO ====> Epoch: 17
2023-02-18 14:02:53,341 32k INFO Train Epoch: 18 [7%]
2023-02-18 14:02:53,341 32k INFO [2.418344736099243, 2.726172924041748, 12.386476516723633, 20.22224998474121, 1.1661674976348877, 1400, 9.978771236724554e-05]
2023-02-18 14:04:58,741 32k INFO ====> Epoch: 18
2023-02-18 14:07:40,854 32k INFO ====> Epoch: 19
2023-02-18 14:09:22,959 32k INFO Train Epoch: 20 [51%]
2023-02-18 14:09:22,959 32k INFO [2.392646551132202, 2.6650896072387695, 11.132346153259277, 20.615514755249023, 1.3077448606491089, 1600, 9.976276699833672e-05]
2023-02-18 14:10:37,010 32k INFO ====> Epoch: 20
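Each bracketed seven-value INFO line is the periodic loss report; in VITS-style train scripts this list is typically [loss_disc, loss_gen, loss_fm, loss_mel, loss_kl, global_step, learning_rate], but treat that ordering as an assumption to verify against this fork's train.py. The trailing value tracks the configured per-epoch exponential decay exactly, which this sketch reproduces:

# Hedged sketch: reproduce the learning rates in the log, assuming the scheduler
# is torch.optim.lr_scheduler.ExponentialLR stepped once per epoch, as in VITS.
base_lr, decay = 1e-4, 0.999875  # 'learning_rate' and 'lr_decay' from the config dump

def lr_at_epoch(epoch: int) -> float:
    # After (epoch - 1) scheduler steps.
    return base_lr * decay ** (epoch - 1)

print(lr_at_epoch(3))   # 9.99750015625e-05, matching the step-200 line above
print(lr_at_epoch(13))  # ~9.98501030820433e-05, matching the step-1000 line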
2023-02-18 14:13:17,208 32k INFO ====> Epoch: 21
2023-02-18 14:15:24,422 32k INFO Train Epoch: 22 [95%]
2023-02-18 14:15:24,423 32k INFO [2.4040393829345703, 2.239150047302246, 12.231693267822266, 20.794464111328125, 0.9484589695930481, 1800, 9.973782786538036e-05]
2023-02-18 14:15:30,868 32k INFO ====> Epoch: 22
2023-02-18 14:17:59,875 32k INFO ====> Epoch: 23
2023-02-18 14:20:20,355 32k INFO ====> Epoch: 24
2023-02-18 14:21:30,802 32k INFO Train Epoch: 25 [39%]
2023-02-18 14:21:30,802 32k INFO [2.4141862392425537, 2.3016927242279053, 13.739073753356934, 21.9274959564209, 1.2016680240631104, 2000, 9.970043085494672e-05]
2023-02-18 14:21:35,541 32k INFO Saving model and optimizer state at iteration 25 to ./logs\32k\G_2000.pth
2023-02-18 14:21:51,929 32k INFO Saving model and optimizer state at iteration 25 to ./logs\32k\D_2000.pth
2023-02-18 14:23:05,507 32k INFO ====> Epoch: 25
2023-02-18 14:25:24,529 32k INFO ====> Epoch: 26
2023-02-18 14:27:25,212 32k INFO Train Epoch: 27 [83%]
2023-02-18 14:27:25,212 32k INFO [2.485182285308838, 2.1557278633117676, 8.625478744506836, 17.08354377746582, 0.8900912404060364, 2200, 9.967550730505221e-05]
2023-02-18 14:27:44,585 32k INFO ====> Epoch: 27
2023-02-18 14:30:04,953 32k INFO ====> Epoch: 28
2023-02-18 14:32:24,245 32k INFO ====> Epoch: 29
2023-02-18 14:33:19,017 32k INFO Train Epoch: 30 [27%]
2023-02-18 14:33:19,018 32k INFO [2.314979076385498, 2.591787099838257, 8.379956245422363, 17.788272857666016, 0.9263051152229309, 2400, 9.963813366190753e-05]
2023-02-18 14:34:43,546 32k INFO ====> Epoch: 30
2023-02-18 14:37:04,473 32k INFO ====> Epoch: 31
2023-02-18 14:38:51,951 32k INFO Train Epoch: 32 [71%]
2023-02-18 14:38:51,952 32k INFO [2.530869960784912, 2.346663475036621, 9.290014266967773, 17.95098114013672, 0.8468811511993408, 2600, 9.961322568533789e-05]
2023-02-18 14:39:25,876 32k INFO ====> Epoch: 32
2023-02-18 14:41:46,299 32k INFO ====> Epoch: 33
2023-02-18 14:44:06,079 32k INFO ====> Epoch: 34
2023-02-18 14:44:47,360 32k INFO Train Epoch: 35 [15%]
2023-02-18 14:44:47,360 32k INFO [2.244014024734497, 2.303522825241089, 12.0291109085083, 20.503751754760742, 1.072399616241455, 2800, 9.957587539488128e-05]
2023-02-18 14:46:28,086 32k INFO ====> Epoch: 35
2023-02-18 14:48:49,406 32k INFO ====> Epoch: 36
2023-02-18 14:50:21,699 32k INFO Train Epoch: 37 [59%]
2023-02-18 14:50:21,700 32k INFO [2.5226240158081055, 2.235443115234375, 8.532631874084473, 17.06432342529297, 0.766471266746521, 3000, 9.95509829819056e-05]
2023-02-18 14:50:26,448 32k INFO Saving model and optimizer state at iteration 37 to ./logs\32k\G_3000.pth
2023-02-18 14:50:44,023 32k INFO Saving model and optimizer state at iteration 37 to ./logs\32k\D_3000.pth
2023-02-18 14:51:35,085 32k INFO ====> Epoch: 37
2023-02-18 14:53:54,527 32k INFO ====> Epoch: 38
2023-02-18 14:56:14,690 32k INFO ====> Epoch: 39
2023-02-18 14:56:41,714 32k INFO Train Epoch: 40 [2%]
2023-02-18 14:56:41,715 32k INFO [2.545912027359009, 2.1446070671081543, 11.442967414855957, 19.688390731811523, 1.1071820259094238, 3200, 9.951365602954526e-05]
2023-02-18 14:58:35,254 32k INFO ====> Epoch: 40
2023-02-18 15:00:55,720 32k INFO ====> Epoch: 41
2023-02-18 15:02:14,254 32k INFO Train Epoch: 42 [46%]
2023-02-18 15:02:14,256 32k INFO [2.4656083583831787, 2.3087027072906494, 8.831151962280273, 20.15584373474121, 0.6311391592025757, 3400, 9.948877917043875e-05]
2023-02-18 15:03:21,337 32k INFO ====> Epoch: 42
2023-02-18 15:05:42,560 32k INFO ====> Epoch: 43
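The G_*.pth / D_*.pth pairs above are written every eval_interval = 1000 global steps, and the "iteration" recorded in the Saving entries is the epoch counter at save time. A hedged sketch for inspecting one of these files; the 'model' / 'iteration' / 'optimizer' / 'learning_rate' keys follow the VITS checkpoint layout this fork derives from, but verify against utils.save_checkpoint in your copy before relying on them:

import torch

# Path from the log above (forward slashes also work on Windows).
ckpt = torch.load("./logs/32k/G_3000.pth", map_location="cpu")

print(ckpt.keys())        # expected: 'model', 'iteration', 'optimizer', 'learning_rate'
print(ckpt["iteration"])  # epoch at save time; 37 for G_3000.pth per the log

# Loading with strict=False tolerates keys absent from the checkpoint, which is
# why the run above only warned that emb_g.weight was missing instead of failing:
# net_g.load_state_dict(ckpt["model"], strict=False)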
2023-02-18 15:07:53,352 32k INFO Train Epoch: 44 [90%]
2023-02-18 15:07:53,353 32k INFO [2.423257827758789, 2.4318912029266357, 8.438339233398438, 18.954322814941406, 1.1647067070007324, 3600, 9.94639085301583e-05]
2023-02-18 15:08:03,537 32k INFO ====> Epoch: 44
2023-02-18 15:10:23,855 32k INFO ====> Epoch: 45
2023-02-18 15:12:44,427 32k INFO ====> Epoch: 46
2023-02-18 15:13:48,785 32k INFO Train Epoch: 47 [34%]
2023-02-18 15:13:48,786 32k INFO [2.1796677112579346, 2.7173850536346436, 11.570164680480957, 17.406679153442383, 1.1616911888122559, 3800, 9.942661422663591e-05]
2023-02-18 15:15:05,864 32k INFO ====> Epoch: 47
2023-02-18 15:17:26,837 32k INFO ====> Epoch: 48
2023-02-18 15:19:22,655 32k INFO Train Epoch: 49 [78%]
2023-02-18 15:19:22,656 32k INFO [2.7236239910125732, 2.036970376968384, 4.593118667602539, 7.638784408569336, 1.1607017517089844, 4000, 9.940175912662009e-05]
2023-02-18 15:19:27,513 32k INFO Saving model and optimizer state at iteration 49 to ./logs\32k\G_4000.pth
2023-02-18 15:19:45,391 32k INFO Saving model and optimizer state at iteration 49 to ./logs\32k\D_4000.pth
2023-02-18 15:20:13,800 32k INFO ====> Epoch: 49
2023-02-18 15:22:33,780 32k INFO ====> Epoch: 50
2023-02-18 15:24:53,853 32k INFO ====> Epoch: 51
2023-02-18 15:25:43,949 32k INFO Train Epoch: 52 [22%]
2023-02-18 15:25:43,949 32k INFO [2.6045544147491455, 2.4030346870422363, 10.298624992370605, 17.781221389770508, 0.9256278276443481, 4200, 9.936448812621091e-05]
2023-02-18 15:27:14,911 32k INFO ====> Epoch: 52
2023-02-18 15:29:35,372 32k INFO ====> Epoch: 53
2023-02-18 15:31:16,948 32k INFO Train Epoch: 54 [66%]
2023-02-18 15:31:16,948 32k INFO [2.4962995052337646, 2.134711503982544, 9.151311874389648, 16.35634422302246, 0.862859308719635, 4400, 9.933964855674948e-05]
2023-02-18 15:31:56,360 32k INFO ====> Epoch: 54
2023-02-18 15:34:16,854 32k INFO ====> Epoch: 55
2023-02-18 15:36:36,402 32k INFO ====> Epoch: 56
2023-02-18 15:37:12,111 32k INFO Train Epoch: 57 [10%]
2023-02-18 15:37:12,122 32k INFO [2.2373344898223877, 2.4235143661499023, 11.974388122558594, 18.775772094726562, 0.9912294745445251, 4600, 9.930240084489267e-05]
2023-02-18 15:38:57,440 32k INFO ====> Epoch: 57
2023-02-18 15:41:17,548 32k INFO ====> Epoch: 58
2023-02-18 15:42:44,716 32k INFO Train Epoch: 59 [54%]
2023-02-18 15:42:44,716 32k INFO [2.32612943649292, 2.505492687225342, 13.369050025939941, 21.683101654052734, 1.2278294563293457, 4800, 9.927757679628145e-05]
2023-02-18 15:43:38,469 32k INFO ====> Epoch: 59
2023-02-18 15:45:59,553 32k INFO ====> Epoch: 60
2023-02-18 15:48:18,553 32k INFO Train Epoch: 61 [98%]
2023-02-18 15:48:18,554 32k INFO [2.332657814025879, 2.309508800506592, 11.992413520812988, 21.970182418823242, 1.1714390516281128, 5000, 9.92527589532945e-05]
2023-02-18 15:48:23,341 32k INFO Saving model and optimizer state at iteration 61 to ./logs\32k\G_5000.pth
2023-02-18 15:48:41,872 32k INFO Saving model and optimizer state at iteration 61 to ./logs\32k\D_5000.pth
2023-02-18 15:48:47,201 32k INFO ====> Epoch: 61
2023-02-18 15:51:06,990 32k INFO ====> Epoch: 62
2023-02-18 15:53:26,772 32k INFO ====> Epoch: 63
2023-02-18 15:54:39,947 32k INFO Train Epoch: 64 [41%]
2023-02-18 15:54:39,947 32k INFO [2.581521987915039, 2.197911500930786, 10.526808738708496, 18.230852127075195, 0.5856090784072876, 5200, 9.921554382096622e-05]
2023-02-18 15:56:05,134 32k INFO ====> Epoch: 64
2023-02-18 15:59:16,071 32k INFO ====> Epoch: 65
2023-02-18 16:02:00,609 32k INFO Train Epoch: 66 [85%]
2023-02-18 16:02:00,610 32k INFO [2.4432015419006348, 2.4601447582244873, 11.853116989135742, 20.211322784423828, 0.9781383872032166, 5400, 9.919074148525384e-05]
2023-02-18 16:02:23,489 32k INFO ====> Epoch: 66
2023-02-18 16:05:35,609 32k INFO ====> Epoch: 67
2023-02-18 16:08:44,666 32k INFO ====> Epoch: 68
2023-02-18 16:09:56,545 32k INFO Train Epoch: 69 [29%]
2023-02-18 16:09:56,546 32k INFO [2.181422233581543, 2.8529210090637207, 11.916988372802734, 21.838470458984375, 1.0593645572662354, 5600, 9.915354960656915e-05]
2023-02-18 16:11:49,999 32k INFO ====> Epoch: 69
2023-02-18 16:14:57,604 32k INFO ====> Epoch: 70
2023-02-18 16:17:27,420 32k INFO Train Epoch: 71 [73%]
2023-02-18 16:17:27,420 32k INFO [1.9035147428512573, 3.021906852722168, 15.565659523010254, 21.79709815979004, 0.946706235408783, 5800, 9.912876276844171e-05]
2023-02-18 16:18:11,589 32k INFO ====> Epoch: 71
2023-02-18 16:21:19,944 32k INFO ====> Epoch: 72
2023-02-18 16:24:31,441 32k INFO ====> Epoch: 73
2023-02-18 16:25:23,054 32k INFO Train Epoch: 74 [17%]
2023-02-18 16:25:23,054 32k INFO [2.3623337745666504, 2.5799450874328613, 7.930063247680664, 17.0313777923584, 1.2069950103759766, 6000, 9.909159412887068e-05]
2023-02-18 16:25:27,842 32k INFO Saving model and optimizer state at iteration 74 to ./logs\32k\G_6000.pth
2023-02-18 16:25:48,638 32k INFO Saving model and optimizer state at iteration 74 to ./logs\32k\D_6000.pth
2023-02-18 16:28:07,473 32k INFO ====> Epoch: 74
2023-02-18 16:31:15,829 32k INFO ====> Epoch: 75
2023-02-18 16:33:18,999 32k INFO Train Epoch: 76 [61%]
2023-02-18 16:33:18,999 32k INFO [2.2943310737609863, 2.8715434074401855, 10.26145076751709, 14.741254806518555, 1.2504029273986816, 6200, 9.906682277864462e-05]
2023-02-18 16:34:18,996 32k INFO ====> Epoch: 76
2023-02-18 16:37:25,713 32k INFO ====> Epoch: 77
2023-02-18 16:40:29,196 32k INFO ====> Epoch: 78
2023-02-18 16:41:01,456 32k INFO Train Epoch: 79 [5%]
2023-02-18 16:41:01,457 32k INFO [2.1732935905456543, 2.650221347808838, 16.90458869934082, 23.275943756103516, 0.5908531546592712, 6400, 9.902967736366644e-05]
2023-02-18 16:43:35,665 32k INFO ====> Epoch: 79
2023-02-18 16:46:40,281 32k INFO ====> Epoch: 80
2023-02-18 16:48:24,511 32k INFO Train Epoch: 81 [49%]
2023-02-18 16:48:24,511 32k INFO [2.488065719604492, 2.189055919647217, 11.57202434539795, 20.690433502197266, 1.0104312896728516, 6600, 9.900492149166423e-05]
2023-02-18 16:49:45,682 32k INFO ====> Epoch: 81
2023-02-18 16:52:48,841 32k INFO ====> Epoch: 82
2023-02-18 16:55:45,194 32k INFO Train Epoch: 83 [93%]
2023-02-18 16:55:45,195 32k INFO [2.363006591796875, 2.5977485179901123, 11.85480785369873, 20.72893714904785, 1.1449049711227417, 6800, 9.89801718082432e-05]
2023-02-18 16:55:54,936 32k INFO ====> Epoch: 83
2023-02-18 16:59:01,228 32k INFO ====> Epoch: 84
2023-02-18 17:02:06,730 32k INFO ====> Epoch: 85
2023-02-18 17:03:30,501 32k INFO Train Epoch: 86 [37%]
2023-02-18 17:03:30,502 32k INFO [2.6049516201019287, 2.1475470066070557, 8.085922241210938, 17.217309951782227, 0.45161640644073486, 7000, 9.894305888331732e-05]
2023-02-18 17:03:35,389 32k INFO Saving model and optimizer state at iteration 86 to ./logs\32k\G_7000.pth
2023-02-18 17:03:53,165 32k INFO Saving model and optimizer state at iteration 86 to ./logs\32k\D_7000.pth
2023-02-18 17:05:34,400 32k INFO ====> Epoch: 86
2023-02-18 17:08:41,181 32k INFO ====> Epoch: 87
2023-02-18 17:11:16,739 32k INFO Train Epoch: 88 [80%]
2023-02-18 17:11:16,739 32k INFO [2.382134199142456, 2.5343871116638184, 9.086328506469727, 14.694384574890137, 0.7267480492591858, 7200, 9.891832466458178e-05]
2023-02-18 17:11:47,110 32k INFO ====> Epoch: 88
2023-02-18 17:14:52,007 32k INFO ====> Epoch: 89
2023-02-18 17:17:56,142 32k INFO ====> Epoch: 90
2023-02-18 17:19:01,057 32k INFO Train Epoch: 91 [24%]
2023-02-18 17:19:01,058 32k INFO [2.4476230144500732, 2.305821418762207, 8.499305725097656, 19.25327491760254, 0.8112967610359192, 7400, 9.888123492943583e-05]
2023-02-18 17:21:03,549 32k INFO ====> Epoch: 91
2023-02-18 17:24:06,841 32k INFO ====> Epoch: 92
2023-02-18 17:26:22,681 32k INFO Train Epoch: 93 [68%]
2023-02-18 17:26:22,688 32k INFO [2.185331344604492, 2.6872944831848145, 15.124022483825684, 21.444419860839844, 1.0479180812835693, 7600, 9.885651616572276e-05]
2023-02-18 17:27:13,773 32k INFO ====> Epoch: 93
2023-02-18 17:30:21,451 32k INFO ====> Epoch: 94
2023-02-18 17:33:29,561 32k INFO ====> Epoch: 95
2023-02-18 17:34:13,186 32k INFO Train Epoch: 96 [12%]
2023-02-18 17:34:13,187 32k INFO [2.347987174987793, 2.395303964614868, 9.463523864746094, 14.174880981445312, 0.40106111764907837, 7800, 9.881944960586671e-05]
2023-02-18 17:36:35,656 32k INFO ====> Epoch: 96
2023-02-18 17:39:42,149 32k INFO ====> Epoch: 97
2023-02-18 17:41:37,677 32k INFO Train Epoch: 98 [56%]
2023-02-18 17:41:37,677 32k INFO [2.503045082092285, 2.4578006267547607, 11.118446350097656, 18.708005905151367, 1.2328639030456543, 8000, 9.879474628751914e-05]
2023-02-18 17:41:42,468 32k INFO Saving model and optimizer state at iteration 98 to ./logs\32k\G_8000.pth
2023-02-18 17:41:59,137 32k INFO Saving model and optimizer state at iteration 98 to ./logs\32k\D_8000.pth
2023-02-18 17:43:11,648 32k INFO ====> Epoch: 98
2023-02-18 17:46:14,721 32k INFO ====> Epoch: 99
2023-02-18 17:49:19,452 32k INFO ====> Epoch: 100
2023-02-18 17:49:43,660 32k INFO Train Epoch: 101 [0%]
2023-02-18 17:49:43,660 32k INFO [2.5995795726776123, 2.3254148960113525, 7.9163641929626465, 16.673885345458984, 0.7785255908966064, 8200, 9.875770288847208e-05]
2023-02-18 17:52:24,475 32k INFO ====> Epoch: 101
2023-02-18 17:55:27,166 32k INFO ====> Epoch: 102
2023-02-18 17:57:03,042 32k INFO Train Epoch: 103 [44%]
2023-02-18 17:57:03,042 32k INFO [2.415909767150879, 2.438415288925171, 10.601877212524414, 19.733566284179688, 0.9555046558380127, 8400, 9.873301500583906e-05]
2023-02-18 17:58:33,665 32k INFO ====> Epoch: 103
2023-02-18 18:01:38,657 32k INFO ====> Epoch: 104
2023-02-18 18:04:28,117 32k INFO Train Epoch: 105 [88%]
2023-02-18 18:04:28,117 32k INFO [2.630995035171509, 2.1970698833465576, 10.794513702392578, 19.191852569580078, 0.7746363878250122, 8600, 9.870833329479095e-05]
2023-02-18 18:04:45,705 32k INFO ====> Epoch: 105
2023-02-18 18:07:52,602 32k INFO ====> Epoch: 106
2023-02-18 18:10:57,947 32k INFO ====> Epoch: 107
2023-02-18 18:12:12,513 32k INFO Train Epoch: 108 [32%]
2023-02-18 18:12:12,513 32k INFO [2.474118709564209, 2.5281760692596436, 7.3929548263549805, 14.464062690734863, 0.6939061880111694, 8800, 9.867132229656573e-05]
2023-02-18 18:14:00,834 32k INFO ====> Epoch: 108
2023-02-18 18:17:07,249 32k INFO ====> Epoch: 109
2023-02-18 18:19:34,317 32k INFO Train Epoch: 110 [76%]
2023-02-18 18:19:34,318 32k INFO [2.6343023777008057, 2.198140859603882, 9.245353698730469, 15.40959644317627, 0.5494029521942139, 9000, 9.864665600773098e-05]
2023-02-18 18:19:39,249 32k INFO Saving model and optimizer state at iteration 110 to ./logs\32k\G_9000.pth
2023-02-18 18:19:56,382 32k INFO Saving model and optimizer state at iteration 110 to ./logs\32k\D_9000.pth
2023-02-18 18:20:37,170 32k INFO ====> Epoch: 110
2023-02-18 18:23:43,211 32k INFO ====> Epoch: 111
2023-02-18 18:26:56,368 32k INFO ====> Epoch: 112
2023-02-18 18:27:52,671 32k INFO Train Epoch: 113 [20%]
2023-02-18 18:27:52,672 32k INFO [2.1823391914367676, 2.942514419555664, 12.15264892578125, 19.10240364074707, 0.7193117141723633, 9200, 9.86096681355974e-05]
2023-02-18 18:30:11,006 32k INFO ====> Epoch: 113
2023-02-18 18:33:21,420 32k INFO ====> Epoch: 114
2023-02-18 18:35:32,306 32k INFO Train Epoch: 115 [63%]
2023-02-18 18:35:32,306 32k INFO [2.3463950157165527, 2.4623875617980957, 13.173855781555176, 21.46563148498535, 0.5739250183105469, 9400, 9.858501725933955e-05]
2023-02-18 18:36:30,806 32k INFO ====> Epoch: 115
2023-02-18 18:39:41,614 32k INFO ====> Epoch: 116
2023-02-18 18:42:56,740 32k INFO ====> Epoch: 117
2023-02-18 18:43:32,935 32k INFO Train Epoch: 118 [7%]
2023-02-18 18:43:32,936 32k INFO [2.4906504154205322, 2.447946786880493, 11.754008293151855, 20.466285705566406, 0.8161544799804688, 9600, 9.854805249884741e-05]
2023-02-18 18:46:10,002 32k INFO ====> Epoch: 118
2023-02-18 18:49:20,693 32k INFO ====> Epoch: 119
2023-02-18 18:51:10,250 32k INFO Train Epoch: 120 [51%]
2023-02-18 18:51:10,251 32k INFO [2.4490957260131836, 2.738856077194214, 12.3360013961792, 19.310625076293945, 0.8394557237625122, 9800, 9.8523417025536e-05]
2023-02-18 18:52:29,403 32k INFO ====> Epoch: 120
2023-02-18 18:55:31,737 32k INFO ====> Epoch: 121
2023-02-18 18:58:27,659 32k INFO Train Epoch: 122 [95%]
2023-02-18 18:58:27,660 32k INFO [2.409554958343506, 2.3621697425842285, 13.538979530334473, 20.507600784301758, 1.1963660717010498, 10000, 9.8498787710708e-05]
2023-02-18 18:58:32,617 32k INFO Saving model and optimizer state at iteration 122 to ./logs\32k\G_10000.pth
2023-02-18 18:58:50,163 32k INFO Saving model and optimizer state at iteration 122 to ./logs\32k\D_10000.pth
2023-02-18 18:58:59,800 32k INFO ====> Epoch: 122
2023-02-18 19:02:03,872 32k INFO ====> Epoch: 123
2023-02-18 19:05:06,686 32k INFO ====> Epoch: 124
2023-02-18 19:06:34,823 32k INFO Train Epoch: 125 [39%]
2023-02-18 19:06:34,824 32k INFO [2.345095634460449, 2.178450345993042, 12.377983093261719, 19.660619735717773, 0.8193285465240479, 10200, 9.846185528225477e-05]
2023-02-18 19:08:12,812 32k INFO ====> Epoch: 125
2023-02-18 19:11:15,775 32k INFO ====> Epoch: 126
2023-02-18 19:13:54,645 32k INFO Train Epoch: 127 [83%]
2023-02-18 19:13:54,645 32k INFO [2.5806427001953125, 2.1979994773864746, 11.834625244140625, 18.687227249145508, 1.0435174703598022, 10400, 9.84372413569007e-05]
2023-02-18 19:14:19,451 32k INFO ====> Epoch: 127
2023-02-18 19:17:26,036 32k INFO ====> Epoch: 128
2023-02-18 19:20:29,081 32k INFO ====> Epoch: 129
2023-02-18 19:21:35,465 32k INFO Train Epoch: 130 [27%]
2023-02-18 19:21:35,466 32k INFO [2.351133346557617, 2.378753423690796, 10.939573287963867, 19.32187271118164, 0.8272395730018616, 10600, 9.840033200544528e-05]
2023-02-18 19:23:33,960 32k INFO ====> Epoch: 130
2023-02-18 19:26:38,810 32k INFO ====> Epoch: 131
2023-02-18 19:28:57,008 32k INFO Train Epoch: 132 [71%]
2023-02-18 19:28:57,048 32k INFO [2.4666779041290283, 2.4331767559051514, 12.116047859191895, 20.228519439697266, 1.3147556781768799, 10800, 9.837573345994909e-05]
2023-02-18 19:29:43,726 32k INFO ====> Epoch: 132
2023-02-18 19:32:50,848 32k INFO ====> Epoch: 133
2023-02-18 19:35:56,257 32k INFO ====> Epoch: 134
2023-02-18 19:36:44,716 32k INFO Train Epoch: 135 [15%]
2023-02-18 19:36:44,716 32k INFO [2.255558490753174, 2.5198585987091064, 12.21342945098877, 20.34984588623047, 1.0012221336364746, 11000, 9.833884717107196e-05]
2023-02-18 19:36:49,985 32k INFO Saving model and optimizer state at iteration 135 to ./logs\32k\G_11000.pth
2023-02-18 19:37:07,494 32k INFO Saving model and optimizer state at iteration 135 to ./logs\32k\D_11000.pth
2023-02-18 19:39:27,828 32k INFO ====> Epoch: 135
2023-02-18 19:42:33,821 32k INFO ====> Epoch: 136
2023-02-18 19:44:35,721 32k INFO Train Epoch: 137 [59%]
2023-02-18 19:44:35,721 32k INFO [2.6769304275512695, 2.0602264404296875, 9.101190567016602, 14.852888107299805, 0.7094751000404358, 11200, 9.831426399582366e-05]
2023-02-18 19:45:41,712 32k INFO ====> Epoch: 137
2023-02-18 19:48:48,842 32k INFO ====> Epoch: 138
2023-02-18 19:51:55,030 32k INFO ====> Epoch: 139
2023-02-18 19:52:23,208 32k INFO Train Epoch: 140 [2%]
2023-02-18 19:52:23,208 32k INFO [2.416285753250122, 2.515517234802246, 11.539551734924316, 18.85158348083496, 0.7964726090431213, 11400, 9.827740075511432e-05]
2023-02-18 19:55:00,407 32k INFO ====> Epoch: 140
2023-02-18 19:58:07,313 32k INFO ====> Epoch: 141
2023-02-18 19:59:48,759 32k INFO Train Epoch: 142 [46%]
2023-02-18 19:59:48,759 32k INFO [2.6437501907348633, 2.2807607650756836, 7.438623428344727, 16.5651798248291, 0.5293334126472473, 11600, 9.825283294050992e-05]
2023-02-18 20:01:13,320 32k INFO ====> Epoch: 142
2023-02-18 20:04:19,009 32k INFO ====> Epoch: 143
2023-02-18 20:07:08,623 32k INFO Train Epoch: 144 [90%]
2023-02-18 20:07:08,623 32k INFO [2.4082539081573486, 2.348365306854248, 13.17957878112793, 20.185009002685547, 1.1610592603683472, 11800, 9.822827126747529e-05]
2023-02-18 20:07:22,174 32k INFO ====> Epoch: 144
2023-02-18 20:10:24,073 32k INFO ====> Epoch: 145
2023-02-18 20:13:29,961 32k INFO ====> Epoch: 146
2023-02-18 20:14:49,654 32k INFO Train Epoch: 147 [34%]
2023-02-18 20:14:49,655 32k INFO [2.6301674842834473, 2.0582547187805176, 7.999320983886719, 13.03806209564209, 0.6115504503250122, 12000, 9.819144027000834e-05]
2023-02-18 20:14:54,379 32k INFO Saving model and optimizer state at iteration 147 to ./logs\32k\G_12000.pth
2023-02-18 20:15:13,472 32k INFO Saving model and optimizer state at iteration 147 to ./logs\32k\D_12000.pth
2023-02-18 20:16:32,707 32k INFO ====> Epoch: 147
2023-02-18 20:18:51,573 32k INFO ====> Epoch: 148
2023-02-18 20:20:45,616 32k INFO Train Epoch: 149 [78%]
2023-02-18 20:20:45,616 32k INFO [2.5716872215270996, 2.044759511947632, 7.352247714996338, 11.54852294921875, 0.6284887790679932, 12200, 9.816689394418209e-05]
2023-02-18 20:21:09,668 32k INFO ====> Epoch: 149
2023-02-18 20:23:28,296 32k INFO ====> Epoch: 150
2023-02-18 20:25:46,478 32k INFO ====> Epoch: 151
2023-02-18 20:26:35,115 32k INFO Train Epoch: 152 [22%]
2023-02-18 20:26:35,116 32k INFO [2.7496554851531982, 2.115495443344116, 8.033178329467773, 17.09941864013672, 0.5979650020599365, 12400, 9.813008596033443e-05]
2023-02-18 20:28:04,631 32k INFO ====> Epoch: 152
2023-02-18 20:30:22,610 32k INFO ====> Epoch: 153
2023-02-18 20:32:02,056 32k INFO Train Epoch: 154 [66%]
2023-02-18 20:32:02,057 32k INFO [2.5520501136779785, 2.038939952850342, 9.094746589660645, 17.24423599243164, 0.9727638959884644, 12600, 9.810555497212693e-05]
2023-02-18 20:32:40,442 32k INFO ====> Epoch: 154
2023-02-18 20:34:58,936 32k INFO ====> Epoch: 155
2023-02-18 20:37:23,270 32k INFO ====> Epoch: 156
2023-02-18 20:37:58,290 32k INFO Train Epoch: 157 [10%]
2023-02-18 20:37:58,290 32k INFO [2.4157652854919434, 2.302234411239624, 9.094962120056152, 15.25003433227539, 0.7912210822105408, 12800, 9.806876998751865e-05]
2023-02-18 20:39:42,082 32k INFO ====> Epoch: 157
2023-02-18 20:42:09,378 32k INFO ====> Epoch: 158
2023-02-18 20:43:36,431 32k INFO Train Epoch: 159 [54%]
2023-02-18 20:43:36,431 32k INFO [2.303689956665039, 2.5022518634796143, 12.047871589660645, 18.625017166137695, 0.8808897733688354, 13000, 9.804425432734629e-05]
2023-02-18 20:43:40,953 32k INFO Saving model and optimizer state at iteration 159 to ./logs\32k\G_13000.pth
2023-02-18 20:43:58,356 32k INFO Saving model and optimizer state at iteration 159 to ./logs\32k\D_13000.pth
2023-02-18 20:44:54,917 32k INFO ====> Epoch: 159
2023-02-18 20:47:14,493 32k INFO ====> Epoch: 160
2023-02-18 20:49:30,972 32k INFO Train Epoch: 161 [98%]
2023-02-18 20:49:30,973 32k INFO [2.311939001083374, 2.59867000579834, 12.213456153869629, 19.764371871948242, 0.9899132251739502, 13200, 9.801974479570593e-05]
2023-02-18 20:49:32,766 32k INFO ====> Epoch: 161
2023-02-18 20:51:59,478 32k INFO ====> Epoch: 162
2023-02-18 20:55:31,794 32k INFO ====> Epoch: 163
2023-02-18 20:57:30,010 32k INFO Train Epoch: 164 [41%]
2023-02-18 20:57:30,010 32k INFO [2.86376690864563, 1.865500807762146, 5.887643814086914, 11.784518241882324, 1.020143747329712, 13400, 9.798299198589162e-05]
2023-02-18 20:59:20,512 32k INFO ====> Epoch: 164
2023-02-18 21:03:06,854 32k INFO ====> Epoch: 165
2023-02-18 21:06:30,117 32k INFO Train Epoch: 166 [85%]
2023-02-18 21:06:30,117 32k INFO [2.325920343399048, 2.6557977199554443, 11.842923164367676, 20.726858139038086, 0.6900365352630615, 13600, 9.795849776887939e-05]
2023-02-18 21:06:56,246 32k INFO ====> Epoch: 166
2023-02-18 21:10:48,496 32k INFO ====> Epoch: 167
2023-02-18 21:14:47,570 32k INFO ====> Epoch: 168
2023-02-18 21:16:21,507 32k INFO Train Epoch: 169 [29%]
2023-02-18 21:16:21,507 32k INFO [2.5149126052856445, 2.531891345977783, 10.036737442016602, 18.885982513427734, 0.8873259425163269, 13800, 9.792176792382932e-05]
2023-02-18 21:18:35,252 32k INFO ====> Epoch: 169
2023-02-18 21:22:06,753 32k INFO ====> Epoch: 170
2023-02-18 21:24:54,047 32k INFO Train Epoch: 171 [73%]
2023-02-18 21:24:54,048 32k INFO [2.2129878997802734, 2.324878215789795, 15.463290214538574, 19.30773162841797, 0.8498368263244629, 14000, 9.789728901187598e-05]
2023-02-18 21:25:03,662 32k INFO Saving model and optimizer state at iteration 171 to ./logs\32k\G_14000.pth
2023-02-18 21:25:43,764 32k INFO Saving model and optimizer state at iteration 171 to ./logs\32k\D_14000.pth
2023-02-18 21:26:48,302 32k INFO ====> Epoch: 171
2023-02-18 21:30:19,052 32k INFO ====> Epoch: 172
2023-02-18 21:33:46,770 32k INFO ====> Epoch: 173
2023-02-18 21:34:50,019 32k INFO Train Epoch: 174 [17%]
2023-02-18 21:34:50,019 32k INFO [2.4638266563415527, 2.162075996398926, 9.001068115234375, 17.737350463867188, 0.4494396448135376, 14200, 9.786058211724074e-05]
2023-02-18 21:37:15,289 32k INFO ====> Epoch: 174
2023-02-18 21:40:41,023 32k INFO ====> Epoch: 175
2023-02-18 21:43:06,569 32k INFO Train Epoch: 176 [61%]
2023-02-18 21:43:06,570 32k INFO [2.4940695762634277, 2.29329514503479, 9.759041786193848, 15.883288383483887, 0.6593114137649536, 14400, 9.783611850078301e-05]
2023-02-18 21:44:18,062 32k INFO ====> Epoch: 176
2023-02-18 21:47:51,907 32k INFO ====> Epoch: 177
2023-02-18 21:51:42,031 32k INFO ====> Epoch: 178
2023-02-18 21:52:30,304 32k INFO Train Epoch: 179 [5%]
2023-02-18 21:52:30,305 32k INFO [2.24495267868042, 2.5450713634490967, 16.089956283569336, 20.423744201660156, 1.29641854763031, 14600, 9.779943454222217e-05]
2023-02-18 21:55:35,640 32k INFO ====> Epoch: 179
2023-02-18 21:58:29,482 32k INFO ====> Epoch: 180
2023-02-18 22:00:10,005 32k INFO Train Epoch: 181 [49%]
2023-02-18 22:00:10,005 32k INFO [2.0902342796325684, 2.3228907585144043, 15.107324600219727, 20.69207763671875, 0.6706202030181885, 14800, 9.777498621170277e-05]
2023-02-18 22:01:19,615 32k INFO ====> Epoch: 181
2023-02-18 22:04:08,136 32k INFO ====> Epoch: 182
2023-02-18 22:06:51,660 32k INFO Train Epoch: 183 [93%]
2023-02-18 22:06:51,662 32k INFO [2.6548900604248047, 2.4761009216308594, 8.531192779541016, 13.265990257263184, 0.7215445637702942, 15000, 9.7750543992884e-05]
2023-02-18 22:07:00,968 32k INFO Saving model and optimizer state at iteration 183 to ./logs\32k\G_15000.pth
2023-02-18 22:07:34,821 32k INFO Saving model and optimizer state at iteration 183 to ./logs\32k\D_15000.pth
2023-02-18 22:07:54,739 32k INFO ====> Epoch: 183
2023-02-18 22:10:44,685 32k INFO ====> Epoch: 184
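The log never states the dataset size, but the Train Epoch headers imply it: global_step = steps_per_epoch × (epoch − 1 + progress), and both (epoch 3, 44%, step 200) and (epoch 5, 88%, step 400) solve to roughly 82 optimizer steps per epoch, i.e. about 490 training clips at batch_size 6. A worked check; the helper below is ours, not part of the trainer:

# Estimate steps per epoch from (epoch, percent, global_step) points in the log.
def steps_per_epoch(epoch: int, percent: float, global_step: int) -> float:
    # global_step = steps_per_epoch * (epoch - 1 + percent)
    return global_step / (epoch - 1 + percent)

print(steps_per_epoch(3, 0.44, 200))   # ~82.0
print(steps_per_epoch(5, 0.88, 400))   # ~82.0
print(82 * 6)                          # ~492 clips at batch_size = 6
# Consistency check: 1000 / 82 ~= 12.2 epochs between checkpoints, matching the
# saves logged at epochs 13, 25, 37, 49, ... 183 above.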