2023-02-09 23:54:56,860 32k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 32000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'aeon': 0}, 'model_dir': './logs\\32k'}
2023-02-09 23:55:16,700 32k INFO Loaded checkpoint './logs\32k\G_0.pth' (iteration 1)
2023-02-09 23:55:20,130 32k INFO Loaded checkpoint './logs\32k\D_0.pth' (iteration 1)
2023-02-09 23:55:47,616 32k INFO Train Epoch: 1 [0%]
2023-02-09 23:55:47,617 32k INFO [2.4760148525238037, 2.3857452869415283, 12.13435173034668, 38.74782943725586, 12.577811241149902, 0, 0.0001]
2023-02-09 23:55:53,842 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\G_0.pth
2023-02-09 23:56:10,616 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\D_0.pth
2023-02-09 23:57:17,312 32k INFO ====> Epoch: 1
2023-02-09 23:58:41,881 32k INFO ====> Epoch: 2
2023-02-09 23:59:43,565 32k INFO Train Epoch: 3 [63%]
2023-02-09 23:59:43,565 32k INFO [3.0033481121063232, 2.0346643924713135, 9.557120323181152, 17.325883865356445, 1.1444225311279297, 200, 9.99750015625e-05]
2023-02-10 00:00:06,584 32k INFO ====> Epoch: 3
2023-02-10 00:01:30,796 32k INFO ====> Epoch: 4
2023-02-10 00:02:55,178 32k INFO ====> Epoch: 5
2023-02-10 00:03:33,965 32k INFO Train Epoch: 6 [26%]
2023-02-10 00:03:33,965 32k INFO [2.6587367057800293, 2.2601664066314697, 11.51845932006836, 20.874536514282227, 1.2725728750228882, 400, 9.993751562304699e-05]
2023-02-10 00:04:19,925 32k INFO ====> Epoch: 6
2023-02-10 00:05:44,080 32k INFO ====> Epoch: 7
2023-02-10 00:07:02,071 32k INFO Train Epoch: 8 [89%]
2023-02-10 00:07:02,071 32k INFO [2.6835384368896484, 2.130737543106079, 6.842741966247559, 16.306140899658203, 1.2471444606781006, 600, 9.991253280566489e-05]
2023-02-10 00:07:08,670 32k INFO ====> Epoch: 8
2023-02-10 00:08:32,953 32k INFO ====> Epoch: 9
2023-02-10 00:09:58,155 32k INFO ====> Epoch: 10
2023-02-10 00:10:53,121 32k INFO Train Epoch: 11 [53%]
2023-02-10 00:10:53,121 32k INFO [2.508908271789551, 2.226515769958496, 10.416086196899414, 16.404212951660156, 0.9339253306388855, 800, 9.987507028906759e-05]
2023-02-10 00:11:22,660 32k INFO ====> Epoch: 11
2023-02-10 00:12:46,946 32k INFO ====> Epoch: 12
2023-02-10 00:14:11,203 32k INFO ====> Epoch: 13
2023-02-10 00:14:43,324 32k INFO Train Epoch: 14 [16%]
2023-02-10 00:14:43,324 32k INFO [2.2750678062438965, 2.470716714859009, 11.370126724243164, 16.513538360595703, 1.350426197052002, 1000, 9.983762181915804e-05]
2023-02-10 00:14:47,828 32k INFO Saving model and optimizer state at iteration 14 to ./logs\32k\G_1000.pth
2023-02-10 00:15:06,487 32k INFO Saving model and optimizer state at iteration 14 to ./logs\32k\D_1000.pth
2023-02-10 00:16:02,684 32k INFO ====> Epoch: 14
2023-02-10 00:17:27,130 32k INFO ====> Epoch: 15
2023-02-10 00:18:38,765 32k INFO Train Epoch: 16 [79%]
2023-02-10 00:18:38,766 32k INFO [2.3250410556793213, 2.5265045166015625, 12.642312049865723, 18.580968856811523, 1.06544029712677, 1200, 9.981266397366609e-05]
2023-02-10 00:18:51,825 32k INFO ====> Epoch: 16
2023-02-10 00:20:16,238 32k INFO ====> Epoch: 17
2023-02-10 00:21:40,479 32k INFO ====> Epoch: 18
2023-02-10 00:22:29,060 32k INFO Train Epoch: 19 [42%]
2023-02-10 00:22:29,061 32k INFO [2.718174457550049, 2.224525213241577, 9.797636985778809, 13.11706256866455, 0.9313941597938538, 1400, 9.977523890319963e-05]
2023-02-10 00:23:05,152 32k INFO ====> Epoch: 19
2023-02-10 00:24:29,510 32k INFO ====> Epoch: 20
2023-02-10 00:25:53,829 32k INFO ====> Epoch: 21
2023-02-10 00:26:19,301 32k INFO Train Epoch: 22 [5%]
2023-02-10 00:26:19,301 32k INFO [2.4931561946868896, 2.2891364097595215, 5.325507164001465, 14.156054496765137, 0.7020604610443115, 1600, 9.973782786538036e-05]
2023-02-10 00:27:18,459 32k INFO ====> Epoch: 22
2023-02-10 00:28:42,594 32k INFO ====> Epoch: 23
2023-02-10 00:29:47,519 32k INFO Train Epoch: 24 [68%]
2023-02-10 00:29:47,519 32k INFO [2.5088696479797363, 2.6265249252319336, 7.155318737030029, 11.276984214782715, 0.6565865278244019, 1800, 9.971289496681757e-05]
2023-02-10 00:30:07,171 32k INFO ====> Epoch: 24
2023-02-10 00:31:31,461 32k INFO ====> Epoch: 25
2023-02-10 00:32:55,740 32k INFO ====> Epoch: 26
2023-02-10 00:33:37,761 32k INFO Train Epoch: 27 [32%]
2023-02-10 00:33:37,761 32k INFO [2.3180291652679443, 2.524212598800659, 11.95003890991211, 18.545923233032227, 0.6431781053543091, 2000, 9.967550730505221e-05]
2023-02-10 00:33:42,280 32k INFO Saving model and optimizer state at iteration 27 to ./logs\32k\G_2000.pth
2023-02-10 00:33:59,554 32k INFO Saving model and optimizer state at iteration 27 to ./logs\32k\D_2000.pth
2023-02-10 00:34:45,750 32k INFO ====> Epoch: 27
2023-02-10 00:36:10,183 32k INFO ====> Epoch: 28
2023-02-10 00:37:32,059 32k INFO Train Epoch: 29 [95%]
2023-02-10 00:37:32,059 32k INFO [2.533724784851074, 2.297114133834839, 9.978306770324707, 15.051481246948242, 0.5844590067863464, 2200, 9.965058998565574e-05]
2023-02-10 00:37:35,283 32k INFO ====> Epoch: 29
2023-02-10 00:39:00,123 32k INFO ====> Epoch: 30
2023-02-10 00:40:24,839 32k INFO ====> Epoch: 31
2023-02-10 00:41:23,329 32k INFO Train Epoch: 32 [58%]
2023-02-10 00:41:23,329 32k INFO [2.198890447616577, 2.618730068206787, 10.582822799682617, 15.83022689819336, 0.8572500348091125, 2400, 9.961322568533789e-05]
2023-02-10 00:41:49,677 32k INFO ====> Epoch: 32
2023-02-10 00:43:14,179 32k INFO ====> Epoch: 33
2023-02-10 00:44:38,626 32k INFO ====> Epoch: 34
2023-02-10 00:45:14,097 32k INFO Train Epoch: 35 [21%]
2023-02-10 00:45:14,098 32k INFO [2.8067591190338135, 2.2793397903442383, 7.9753804206848145, 11.641331672668457, 1.0224435329437256, 2600, 9.957587539488128e-05]
2023-02-10 00:46:03,586 32k INFO ====> Epoch: 35
2023-02-10 00:47:27,999 32k INFO ====> Epoch: 36
2023-02-10 00:48:42,929 32k INFO Train Epoch: 37 [84%]
2023-02-10 00:48:42,930 32k INFO [2.4459195137023926, 2.123002767562866, 10.907198905944824, 16.264312744140625, 0.6869288682937622, 2800, 9.95509829819056e-05]
2023-02-10 00:48:52,746 32k INFO ====> Epoch: 37
2023-02-10 00:50:17,223 32k INFO ====> Epoch: 38
2023-02-10 00:51:41,864 32k INFO ====> Epoch: 39
2023-02-10 00:52:33,774 32k INFO Train Epoch: 40 [47%]
2023-02-10 00:52:33,774 32k INFO [2.385266065597534, 2.274216651916504, 10.997411727905273, 16.773902893066406, 0.8327657580375671, 3000, 9.951365602954526e-05]
2023-02-10 00:52:38,348 32k INFO Saving model and optimizer state at iteration 40 to ./logs\32k\G_3000.pth
2023-02-10 00:52:57,291 32k INFO Saving model and optimizer state at iteration 40 to ./logs\32k\D_3000.pth
2023-02-10 00:53:33,533 32k INFO ====> Epoch: 40
2023-02-10 00:54:58,115 32k INFO ====> Epoch: 41
2023-02-10 00:56:22,645 32k INFO ====> Epoch: 42
2023-02-10 00:56:51,489 32k INFO Train Epoch: 43 [11%]
2023-02-10 00:56:51,490 32k INFO [2.287508249282837, 2.692063331604004, 9.70651626586914, 16.967220306396484, 0.798807680606842, 3200, 9.947634307304244e-05]
2023-02-10 00:57:47,480 32k INFO ====> Epoch: 43
2023-02-10 00:59:12,119 32k INFO ====> Epoch: 44
2023-02-10 01:00:20,508 32k INFO Train Epoch: 45 [74%]
2023-02-10 01:00:20,509 32k INFO [2.249314785003662, 2.683621883392334, 9.802401542663574, 17.789817810058594, 0.4752405881881714, 3400, 9.945147554159202e-05]
2023-02-10 01:00:36,926 32k INFO ====> Epoch: 45
2023-02-10 01:02:01,330 32k INFO ====> Epoch: 46
2023-02-10 01:03:25,782 32k INFO ====> Epoch: 47
2023-02-10 01:04:11,005 32k INFO Train Epoch: 48 [37%]
2023-02-10 01:04:11,006 32k INFO [2.2856054306030273, 2.5306761264801025, 9.405101776123047, 14.461358070373535, 0.6637941002845764, 3600, 9.941418589985758e-05]
2023-02-10 01:04:50,618 32k INFO ====> Epoch: 48
2023-02-10 01:06:16,153 32k INFO ====> Epoch: 49
2023-02-10 01:07:40,668 32k INFO ====> Epoch: 50
2023-02-10 01:08:02,827 32k INFO Train Epoch: 51 [0%]
2023-02-10 01:08:02,827 32k INFO [2.449678659439087, 2.3578238487243652, 10.553171157836914, 16.373308181762695, 1.1569361686706543, 3800, 9.937691023999092e-05]
2023-02-10 01:09:05,476 32k INFO ====> Epoch: 51
2023-02-10 01:10:29,887 32k INFO ====> Epoch: 52
2023-02-10 01:11:31,703 32k INFO Train Epoch: 53 [63%]
2023-02-10 01:11:31,703 32k INFO [2.4566311836242676, 2.3078484535217285, 9.267095565795898, 14.384586334228516, 0.9955094456672668, 4000, 9.935206756519513e-05]
2023-02-10 01:11:36,151 32k INFO Saving model and optimizer state at iteration 53 to ./logs\32k\G_4000.pth
2023-02-10 01:11:53,677 32k INFO Saving model and optimizer state at iteration 53 to ./logs\32k\D_4000.pth
2023-02-10 01:12:20,483 32k INFO ====> Epoch: 53
2023-02-10 01:13:45,174 32k INFO ====> Epoch: 54
2023-02-10 01:15:09,860 32k INFO ====> Epoch: 55
2023-02-10 01:15:48,458 32k INFO Train Epoch: 56 [26%]
2023-02-10 01:15:48,458 32k INFO [2.2449021339416504, 2.2987923622131348, 10.592050552368164, 14.990753173828125, 0.2585275173187256, 4200, 9.931481519679228e-05]
2023-02-10 01:16:34,690 32k INFO ====> Epoch: 56
2023-02-10 01:17:59,199 32k INFO ====> Epoch: 57
2023-02-10 01:19:17,517 32k INFO Train Epoch: 58 [89%]
2023-02-10 01:19:17,518 32k INFO [2.7950096130371094, 2.033355236053467, 6.349589824676514, 13.051651000976562, 0.41371816396713257, 4400, 9.928998804478705e-05]
2023-02-10 01:19:24,069 32k INFO ====> Epoch: 58
2023-02-10 01:20:48,652 32k INFO ====> Epoch: 59
2023-02-10 01:22:13,434 32k INFO ====> Epoch: 60
2023-02-10 01:23:08,515 32k INFO Train Epoch: 61 [53%]
2023-02-10 01:23:08,516 32k INFO [2.509904146194458, 2.3118839263916016, 10.964993476867676, 18.20663833618164, 0.8472650647163391, 4600, 9.92527589532945e-05]
2023-02-10 01:23:38,264 32k INFO ====> Epoch: 61
2023-02-10 01:25:02,945 32k INFO ====> Epoch: 62
2023-02-10 01:26:27,685 32k INFO ====> Epoch: 63
2023-02-10 01:26:59,803 32k INFO Train Epoch: 64 [16%]
2023-02-10 01:26:59,803 32k INFO [2.302783489227295, 2.5982413291931152, 10.80894660949707, 14.60811996459961, 1.0399219989776611, 4800, 9.921554382096622e-05]
2023-02-10 01:27:52,584 32k INFO ====> Epoch: 64
2023-02-10 01:29:17,080 32k INFO ====> Epoch: 65
2023-02-10 01:30:28,770 32k INFO Train Epoch: 66 [79%]
2023-02-10 01:30:28,770 32k INFO [2.7284963130950928, 2.3790788650512695, 9.036128044128418, 13.677116394042969, 0.6173059940338135, 5000, 9.919074148525384e-05]
2023-02-10 01:30:33,203 32k INFO Saving model and optimizer state at iteration 66 to ./logs\32k\G_5000.pth
2023-02-10 01:30:50,085 32k INFO Saving model and optimizer state at iteration 66 to ./logs\32k\D_5000.pth
2023-02-10 01:31:06,739 32k INFO ====> Epoch: 66
2023-02-10 01:32:31,353 32k INFO ====> Epoch: 67
2023-02-10 01:33:57,529 32k INFO ====> Epoch: 68
2023-02-10 01:34:46,261 32k INFO Train Epoch: 69 [42%]
2023-02-10 01:34:46,262 32k INFO [2.5467963218688965, 2.0986404418945312, 10.101814270019531, 14.508784294128418, 1.3656277656555176, 5200, 9.915354960656915e-05]
2023-02-10 01:35:22,514 32k INFO ====> Epoch: 69
2023-02-10 01:36:47,135 32k INFO ====> Epoch: 70
2023-02-10 01:38:11,576 32k INFO ====> Epoch: 71
2023-02-10 01:38:37,123 32k INFO Train Epoch: 72 [5%]
2023-02-10 01:38:37,123 32k INFO [1.857373833656311, 3.1599011421203613, 8.379103660583496, 9.244300842285156, 1.2350279092788696, 5400, 9.911637167309565e-05]
2023-02-10 01:39:36,388 32k INFO ====> Epoch: 72
2023-02-10 01:41:00,864 32k INFO ====> Epoch: 73
2023-02-10 01:42:05,899 32k INFO Train Epoch: 74 [68%]
2023-02-10 01:42:05,900 32k INFO [2.3246712684631348, 2.638263463973999, 7.793229579925537, 13.017571449279785, 0.9624058604240417, 5600, 9.909159412887068e-05]
2023-02-10 01:42:25,722 32k INFO ====> Epoch: 74
2023-02-10 01:43:50,155 32k INFO ====> Epoch: 75
2023-02-10 01:45:14,883 32k INFO ====> Epoch: 76
2023-02-10 01:45:56,818 32k INFO Train Epoch: 77 [32%]
2023-02-10 01:45:56,818 32k INFO [2.2799293994903564, 2.503884792327881, 12.849569320678711, 18.74350929260254, 0.9742995500564575, 5800, 9.905443942579728e-05]
2023-02-10 01:46:39,621 32k INFO ====> Epoch: 77
2023-02-10 01:48:04,267 32k INFO ====> Epoch: 78
2023-02-10 01:49:25,876 32k INFO Train Epoch: 79 [95%]
2023-02-10 01:49:25,876 32k INFO [2.3028907775878906, 2.451934337615967, 11.012730598449707, 17.174062728881836, 0.7448775768280029, 6000, 9.902967736366644e-05]
2023-02-10 01:49:30,404 32k INFO Saving model and optimizer state at iteration 79 to ./logs\32k\G_6000.pth
2023-02-10 01:49:47,910 32k INFO Saving model and optimizer state at iteration 79 to ./logs\32k\D_6000.pth
2023-02-10 01:49:54,757 32k INFO ====> Epoch: 79
2023-02-10 01:51:19,458 32k INFO ====> Epoch: 80
2023-02-10 01:52:44,119 32k INFO ====> Epoch: 81
2023-02-10 01:53:42,673 32k INFO Train Epoch: 82 [58%]
2023-02-10 01:53:42,674 32k INFO [2.279362201690674, 2.260122776031494, 8.51861572265625, 12.269641876220703, 0.7776111960411072, 6200, 9.899254587647776e-05]
2023-02-10 01:54:09,077 32k INFO ====> Epoch: 82
2023-02-10 01:55:33,754 32k INFO ====> Epoch: 83
2023-02-10 01:56:58,300 32k INFO ====> Epoch: 84
2023-02-10 01:57:33,593 32k INFO Train Epoch: 85 [21%]
2023-02-10 01:57:33,594 32k INFO [2.6448638439178467, 2.2926855087280273, 11.688180923461914, 18.19353675842285, 0.9365038275718689, 6400, 9.895542831185631e-05]
2023-02-10 01:58:23,117 32k INFO ====> Epoch: 85
2023-02-10 01:59:47,507 32k INFO ====> Epoch: 86
2023-02-10 02:01:02,283 32k INFO Train Epoch: 87 [84%]
2023-02-10 02:01:02,283 32k INFO [2.427564859390259, 2.242896556854248, 11.131178855895996, 16.051637649536133, 0.914648711681366, 6600, 9.89306910009569e-05]
2023-02-10 02:01:12,207 32k INFO ====> Epoch: 87
2023-02-10 02:02:36,685 32k INFO ====> Epoch: 88
2023-02-10 02:04:01,181 32k INFO ====> Epoch: 89
2023-02-10 02:04:52,920 32k INFO Train Epoch: 90 [47%]
2023-02-10 02:04:52,921 32k INFO [2.4379117488861084, 2.36348557472229, 8.932344436645508, 16.195209503173828, 0.8471890091896057, 6800, 9.889359662901445e-05]
2023-02-10 02:05:25,823 32k INFO ====> Epoch: 90
2023-02-10 02:06:50,173 32k INFO ====> Epoch: 91
2023-02-10 02:08:14,749 32k INFO ====> Epoch: 92
2023-02-10 02:08:43,328 32k INFO Train Epoch: 93 [11%]
2023-02-10 02:08:43,329 32k INFO [2.467284917831421, 2.144986391067505, 11.449904441833496, 16.176807403564453, 0.4319636821746826, 7000, 9.885651616572276e-05]
2023-02-10 02:08:47,768 32k INFO Saving model and optimizer state at iteration 93 to ./logs\32k\G_7000.pth
2023-02-10 02:09:04,812 32k INFO Saving model and optimizer state at iteration 93 to ./logs\32k\D_7000.pth
2023-02-10 02:10:04,543 32k INFO ====> Epoch: 93
2023-02-10 02:11:29,241 32k INFO ====> Epoch: 94
2023-02-10 02:12:37,720 32k INFO Train Epoch: 95 [74%]
2023-02-10 02:12:37,720 32k INFO [2.2731475830078125, 2.425633430480957, 10.031761169433594, 14.7035493850708, 0.8044810891151428, 7200, 9.883180358131438e-05]
2023-02-10 02:12:54,162 32k INFO ====> Epoch: 95
2023-02-10 02:14:18,638 32k INFO ====> Epoch: 96
2023-02-10 02:15:42,921 32k INFO ====> Epoch: 97
2023-02-10 02:16:28,193 32k INFO Train Epoch: 98 [37%]
2023-02-10 02:16:28,194 32k INFO [2.3223936557769775, 2.573565721511841, 10.411147117614746, 19.68339729309082, 0.5216219425201416, 7400, 9.879474628751914e-05]
2023-02-10 02:17:07,685 32k INFO ====> Epoch: 98
2023-02-10 02:18:32,253 32k INFO ====> Epoch: 99
2023-02-10 02:19:56,642 32k INFO ====> Epoch: 100
2023-02-10 02:20:18,741 32k INFO Train Epoch: 101 [0%]
2023-02-10 02:20:18,742 32k INFO [2.397695541381836, 2.1934003829956055, 8.183572769165039, 12.770599365234375, 1.0327764749526978, 7600, 9.875770288847208e-05]
2023-02-10 02:21:21,459 32k INFO ====> Epoch: 101
2023-02-10 02:22:46,182 32k INFO ====> Epoch: 102
2023-02-10 02:23:47,848 32k INFO Train Epoch: 103 [63%]
2023-02-10 02:23:47,848 32k INFO [2.461080312728882, 2.3870487213134766, 6.649186134338379, 10.037254333496094, 0.8956831693649292, 7800, 9.873301500583906e-05]
2023-02-10 02:24:10,873 32k INFO ====> Epoch: 103
2023-02-10 02:25:35,259 32k INFO ====> Epoch: 104
2023-02-10 02:26:59,649 32k INFO ====> Epoch: 105
2023-02-10 02:27:38,326 32k INFO Train Epoch: 106 [26%]
2023-02-10 02:27:38,327 32k INFO [1.846121072769165, 3.3351776599884033, 8.606220245361328, 9.130136489868164, 0.8224015831947327, 8000, 9.86959947531291e-05]
2023-02-10 02:27:42,852 32k INFO Saving model and optimizer state at iteration 106 to ./logs\32k\G_8000.pth
2023-02-10 02:28:02,191 32k INFO Saving model and optimizer state at iteration 106 to ./logs\32k\D_8000.pth
2023-02-10 02:28:52,049 32k INFO ====> Epoch: 106
2023-02-10 02:30:17,599 32k INFO ====> Epoch: 107
2023-02-10 02:31:35,925 32k INFO Train Epoch: 108 [89%]
2023-02-10 02:31:35,925 32k INFO [2.4512813091278076, 2.255336046218872, 7.023709297180176, 11.943727493286133, 0.9954293370246887, 8200, 9.867132229656573e-05]
2023-02-10 02:31:42,471 32k INFO ====> Epoch: 108
2023-02-10 02:33:07,180 32k INFO ====> Epoch: 109
2023-02-10 02:34:31,690 32k INFO ====> Epoch: 110
2023-02-10 02:35:26,947 32k INFO Train Epoch: 111 [53%]
2023-02-10 02:35:26,947 32k INFO [2.391657590866089, 2.3984644412994385, 13.551353454589844, 14.703734397888184, 0.9091495871543884, 8400, 9.863432517573002e-05]
2023-02-10 02:35:56,571 32k INFO ====> Epoch: 111
2023-02-10 02:37:21,163 32k INFO ====> Epoch: 112
2023-02-10 02:38:45,688 32k INFO ====> Epoch: 113
2023-02-10 02:39:17,654 32k INFO Train Epoch: 114 [16%]
2023-02-10 02:39:17,655 32k INFO [2.39654278755188, 2.245805263519287, 10.06233024597168, 14.09300708770752, 0.6480247974395752, 8600, 9.859734192708044e-05]
2023-02-10 02:40:10,405 32k INFO ====> Epoch: 114
2023-02-10 02:41:34,622 32k INFO ====> Epoch: 115
2023-02-10 02:42:46,274 32k INFO Train Epoch: 116 [79%]
2023-02-10 02:42:46,274 32k INFO [2.5749049186706543, 2.414797306060791, 9.960501670837402, 16.4396915435791, 0.789290189743042, 8800, 9.857269413218213e-05]
2023-02-10 02:42:59,471 32k INFO ====> Epoch: 116
2023-02-10 02:44:24,046 32k INFO ====> Epoch: 117
2023-02-10 02:45:48,517 32k INFO ====> Epoch: 118
2023-02-10 02:46:37,109 32k INFO Train Epoch: 119 [42%]
2023-02-10 02:46:37,109 32k INFO [2.4546942710876465, 2.1338768005371094, 12.899450302124023, 15.397659301757812, 0.8034476637840271, 9000, 9.853573399228505e-05]
2023-02-10 02:46:41,546 32k INFO Saving model and optimizer state at iteration 119 to ./logs\32k\G_9000.pth
2023-02-10 02:47:01,129 32k INFO Saving model and optimizer state at iteration 119 to ./logs\32k\D_9000.pth
2023-02-10 02:47:40,886 32k INFO ====> Epoch: 119
2023-02-10 02:49:05,371 32k INFO ====> Epoch: 120
2023-02-10 02:50:29,859 32k INFO ====> Epoch: 121
2023-02-10 02:50:55,463 32k INFO Train Epoch: 122 [5%]
2023-02-10 02:50:55,463 32k INFO [2.207282543182373, 2.6213908195495605, 7.647693634033203, 14.46692180633545, 0.9529262185096741, 9200, 9.8498787710708e-05]
2023-02-10 02:51:55,210 32k INFO ====> Epoch: 122
2023-02-10 02:53:19,953 32k INFO ====> Epoch: 123
2023-02-10 02:54:25,092 32k INFO Train Epoch: 124 [68%]
2023-02-10 02:54:25,093 32k INFO [2.332580089569092, 2.3436594009399414, 6.139850616455078, 10.373167037963867, 0.9625253677368164, 9400, 9.847416455282387e-05]
2023-02-10 02:54:44,808 32k INFO ====> Epoch: 124
2023-02-10 02:56:09,403 32k INFO ====> Epoch: 125
2023-02-10 02:57:35,700 32k INFO ====> Epoch: 126
2023-02-10 02:58:17,551 32k INFO Train Epoch: 127 [32%]
2023-02-10 02:58:17,552 32k INFO [2.467034339904785, 2.2089598178863525, 7.952523708343506, 16.058887481689453, 0.8524291515350342, 9600, 9.84372413569007e-05]
2023-02-10 02:59:00,413 32k INFO ====> Epoch: 127
2023-02-10 03:00:24,954 32k INFO ====> Epoch: 128
2023-02-10 03:01:46,396 32k INFO Train Epoch: 129 [95%]
2023-02-10 03:01:46,397 32k INFO [2.367544174194336, 2.7189576625823975, 6.78631591796875, 12.748686790466309, 0.4685216546058655, 9800, 9.841263358464336e-05]
2023-02-10 03:01:49,616 32k INFO ====> Epoch: 129
2023-02-10 03:03:14,271 32k INFO ====> Epoch: 130
2023-02-10 03:04:38,789 32k INFO ====> Epoch: 131
2023-02-10 03:05:37,230 32k INFO Train Epoch: 132 [58%]
2023-02-10 03:05:37,231 32k INFO [2.4509665966033936, 2.7042665481567383, 10.243302345275879, 13.67796802520752, 0.6009257435798645, 10000, 9.837573345994909e-05]
2023-02-10 03:05:41,749 32k INFO Saving model and optimizer state at iteration 132 to ./logs\32k\G_10000.pth
2023-02-10 03:05:58,430 32k INFO Saving model and optimizer state at iteration 132 to ./logs\32k\D_10000.pth
2023-02-10 03:06:28,668 32k INFO ====> Epoch: 132
2023-02-10 03:07:54,369 32k INFO ====> Epoch: 133
2023-02-10 03:09:19,084 32k INFO ====> Epoch: 134
2023-02-10 03:09:54,319 32k INFO Train Epoch: 135 [21%]
2023-02-10 03:09:54,319 32k INFO [2.5057411193847656, 2.4407787322998047, 9.817587852478027, 13.728426933288574, 0.6413305997848511, 10200, 9.833884717107196e-05]
2023-02-10 03:10:43,813 32k INFO ====> Epoch: 135
2023-02-10 03:12:08,539 32k INFO ====> Epoch: 136
2023-02-10 03:13:23,629 32k INFO Train Epoch: 137 [84%]
2023-02-10 03:13:23,629 32k INFO [2.5296790599823, 2.208813190460205, 6.762031078338623, 10.408514976501465, 0.754698634147644, 10400, 9.831426399582366e-05]
2023-02-10 03:13:33,495 32k INFO ====> Epoch: 137
2023-02-10 03:14:58,219 32k INFO ====> Epoch: 138
2023-02-10 03:16:22,869 32k INFO ====> Epoch: 139
2023-02-10 03:17:14,809 32k INFO Train Epoch: 140 [47%]
2023-02-10 03:17:14,809 32k INFO [2.5353844165802, 2.2106950283050537, 10.004817008972168, 13.992541313171387, 0.6150087714195251, 10600, 9.827740075511432e-05]
2023-02-10 03:17:47,789 32k INFO ====> Epoch: 140
2023-02-10 03:19:12,269 32k INFO ====> Epoch: 141
2023-02-10 03:20:36,797 32k INFO ====> Epoch: 142
2023-02-10 03:21:05,474 32k INFO Train Epoch: 143 [11%]
2023-02-10 03:21:05,475 32k INFO [2.335813522338867, 2.381714344024658, 9.432477951049805, 14.522370338439941, 0.697417140007019, 10800, 9.824055133639235e-05]
2023-02-10 03:22:01,624 32k INFO ====> Epoch: 143
2023-02-10 03:23:26,232 32k INFO ====> Epoch: 144
2023-02-10 03:24:34,537 32k INFO Train Epoch: 145 [74%]
2023-02-10 03:24:34,538 32k INFO [1.924493670463562, 3.025908946990967, 10.635231018066406, 12.46295166015625, 0.3505359888076782, 11000, 9.821599273356685e-05]
2023-02-10 03:24:39,057 32k INFO Saving model and optimizer state at iteration 145 to ./logs\32k\G_11000.pth
2023-02-10 03:24:57,871 32k INFO Saving model and optimizer state at iteration 145 to ./logs\32k\D_11000.pth
2023-02-10 03:25:18,230 32k INFO ====> Epoch: 145
2023-02-10 03:26:42,920 32k INFO ====> Epoch: 146
2023-02-10 03:28:07,571 32k INFO ====> Epoch: 147
2023-02-10 03:28:52,871 32k INFO Train Epoch: 148 [37%]
2023-02-10 03:28:52,872 32k INFO [2.568720579147339, 2.2338619232177734, 6.399820327758789, 12.950323104858398, 0.9168809056282043, 11200, 9.817916633997459e-05]
2023-02-10 03:29:32,512 32k INFO ====> Epoch: 148
2023-02-10 03:30:57,311 32k INFO ====> Epoch: 149
2023-02-10 03:32:22,084 32k INFO ====> Epoch: 150
2023-02-10 03:32:44,293 32k INFO Train Epoch: 151 [0%]
2023-02-10 03:32:44,293 32k INFO [2.3033745288848877, 2.415085792541504, 10.928938865661621, 16.1735897064209, 0.3847680985927582, 11400, 9.814235375455375e-05]
2023-02-10 03:33:47,215 32k INFO ====> Epoch: 151
2023-02-10 03:35:11,867 32k INFO ====> Epoch: 152
2023-02-10 03:36:13,752 32k INFO Train Epoch: 153 [63%]
2023-02-10 03:36:13,753 32k INFO [2.173736333847046, 2.650914192199707, 12.767165184020996, 18.795696258544922, 0.4320840835571289, 11600, 9.811781969958938e-05]
2023-02-10 03:36:36,873 32k INFO ====> Epoch: 153
2023-02-10 03:38:01,486 32k INFO ====> Epoch: 154
2023-02-10 03:39:26,006 32k INFO ====> Epoch: 155
2023-02-10 03:40:04,581 32k INFO Train Epoch: 156 [26%]
2023-02-10 03:40:04,582 32k INFO [2.574475049972534, 2.363281488418579, 8.882105827331543, 10.567209243774414, 0.11711447685956955, 11800, 9.808103011628319e-05]
2023-02-10 03:40:50,861 32k INFO ====> Epoch: 156
2023-02-10 03:42:15,437 32k INFO ====> Epoch: 157
2023-02-10 03:43:33,696 32k INFO Train Epoch: 158 [89%]
2023-02-10 03:43:33,696 32k INFO [2.343777894973755, 2.4922895431518555, 10.256402969360352, 16.83711051940918, 0.874286413192749, 12000, 9.80565113912702e-05]
2023-02-10 03:43:38,306 32k INFO Saving model and optimizer state at iteration 158 to ./logs\32k\G_12000.pth
2023-02-10 03:43:55,136 32k INFO Saving model and optimizer state at iteration 158 to ./logs\32k\D_12000.pth
2023-02-10 03:44:05,409 32k INFO ====> Epoch: 158
2023-02-10 03:45:30,201 32k INFO ====> Epoch: 159
2023-02-10 03:46:54,691 32k INFO ====> Epoch: 160
2023-02-10 03:47:49,976 32k INFO Train Epoch: 161 [53%]
2023-02-10 03:47:49,976 32k INFO [2.596492290496826, 2.6639647483825684, 7.916460037231445, 15.857110977172852, 0.8963372707366943, 12200, 9.801974479570593e-05]
2023-02-10 03:48:19,643 32k INFO ====> Epoch: 161
2023-02-10 03:49:44,342 32k INFO ====> Epoch: 162
2023-02-10 03:51:09,047 32k INFO ====> Epoch: 163
2023-02-10 03:51:41,212 32k INFO Train Epoch: 164 [16%]
2023-02-10 03:51:41,212 32k INFO [2.252506971359253, 2.4695920944213867, 11.395182609558105, 16.45693016052246, 0.94123774766922, 12400, 9.798299198589162e-05]
2023-02-10 03:52:34,132 32k INFO ====> Epoch: 164
2023-02-10 03:53:59,347 32k INFO ====> Epoch: 165
2023-02-10 03:55:11,032 32k INFO Train Epoch: 166 [79%]
2023-02-10 03:55:11,032 32k INFO [2.5481297969818115, 2.5002939701080322, 10.718299865722656, 17.71586036682129, 0.3888394236564636, 12600, 9.795849776887939e-05]
2023-02-10 03:55:24,184 32k INFO ====> Epoch: 166
2023-02-10 03:56:48,762 32k INFO ====> Epoch: 167
2023-02-10 03:58:13,310 32k INFO ====> Epoch: 168
2023-02-10 03:59:01,779 32k INFO Train Epoch: 169 [42%]
2023-02-10 03:59:01,779 32k INFO [2.531505584716797, 1.924519658088684, 13.503902435302734, 16.02785301208496, 0.9448908567428589, 12800, 9.792176792382932e-05]
2023-02-10 03:59:38,066 32k INFO ====> Epoch: 169
2023-02-10 04:01:02,587 32k INFO ====> Epoch: 170
2023-02-10 04:02:27,100 32k INFO ====> Epoch: 171
2023-02-10 04:02:52,429 32k INFO Train Epoch: 172 [5%]
2023-02-10 04:02:52,430 32k INFO [2.5668065547943115, 2.419389247894287, 8.094529151916504, 14.7818603515625, 0.7867487072944641, 13000, 9.78850518507495e-05]
2023-02-10 04:02:56,916 32k INFO Saving model and optimizer state at iteration 172 to ./logs\32k\G_13000.pth
2023-02-10 04:03:14,363 32k INFO Saving model and optimizer state at iteration 172 to ./logs\32k\D_13000.pth
2023-02-10 04:04:17,632 32k INFO ====> Epoch: 172
2023-02-10 04:05:42,306 32k INFO ====> Epoch: 173
2023-02-10 04:06:47,480 32k INFO Train Epoch: 174 [68%]
2023-02-10 04:06:47,480 32k INFO [2.1723153591156006, 2.5889978408813477, 10.076593399047852, 12.853192329406738, 0.3051690459251404, 13200, 9.786058211724074e-05]
2023-02-10 04:07:07,239 32k INFO ====> Epoch: 174
2023-02-10 04:08:31,852 32k INFO ====> Epoch: 175
2023-02-10 04:09:56,580 32k INFO ====> Epoch: 176
2023-02-10 04:10:38,557 32k INFO Train Epoch: 177 [32%]
2023-02-10 04:10:38,558 32k INFO [2.395698070526123, 2.4369287490844727, 11.675952911376953, 15.961197853088379, 0.7354399561882019, 13400, 9.782388898597041e-05]
2023-02-10 04:11:21,538 32k INFO ====> Epoch: 177
2023-02-10 04:12:46,203 32k INFO ====> Epoch: 178
2023-02-10 04:14:07,776 32k INFO Train Epoch: 179 [95%]
2023-02-10 04:14:07,777 32k INFO [2.7345943450927734, 1.9628326892852783, 6.206479072570801, 7.970531940460205, 0.8745248913764954, 13600, 9.779943454222217e-05]
2023-02-10 04:14:10,989 32k INFO ====> Epoch: 179
2023-02-10 04:15:35,842 32k INFO ====> Epoch: 180
2023-02-10 04:17:00,411 32k INFO ====> Epoch: 181
2023-02-10 04:17:58,909 32k INFO Train Epoch: 182 [58%]
2023-02-10 04:17:58,909 32k INFO [2.2180511951446533, 2.461686372756958, 7.78315544128418, 12.147871971130371, 0.7892422080039978, 13800, 9.776276433842631e-05]
2023-02-10 04:18:25,267 32k INFO ====> Epoch: 182
2023-02-10 04:19:49,727 32k INFO ====> Epoch: 183
2023-02-10 04:21:14,382 32k INFO ====> Epoch: 184
2023-02-10 04:21:49,851 32k INFO Train Epoch: 185 [21%]
2023-02-10 04:21:49,851 32k INFO [2.5905978679656982, 2.213409185409546, 9.200039863586426, 12.909902572631836, 0.1616513580083847, 14000, 9.772610788423802e-05]
2023-02-10 04:21:54,470 32k INFO Saving model and optimizer state at iteration 185 to ./logs\32k\G_14000.pth
2023-02-10 04:22:11,052 32k INFO Saving model and optimizer state at iteration 185 to ./logs\32k\D_14000.pth
2023-02-10 04:23:03,696 32k INFO ====> Epoch: 185
2023-02-10 04:24:28,374 32k INFO ====> Epoch: 186
2023-02-10 04:25:43,503 32k INFO Train Epoch: 187 [84%]
2023-02-10 04:25:43,504 32k INFO [2.115373134613037, 2.6405253410339355, 13.0419340133667, 17.158859252929688, 0.6063501238822937, 14200, 9.77016778842374e-05]
2023-02-10 04:25:53,337 32k INFO ====> Epoch: 187
2023-02-10 04:27:18,024 32k INFO ====> Epoch: 188
2023-02-10 04:28:42,529 32k INFO ====> Epoch: 189
2023-02-10 04:29:34,368 32k INFO Train Epoch: 190 [47%]
2023-02-10 04:29:34,368 32k INFO [2.0680086612701416, 3.0300402641296387, 8.614965438842773, 12.649006843566895, 0.691765546798706, 14400, 9.766504433460612e-05]
2023-02-10 04:30:07,450 32k INFO ====> Epoch: 190
2023-02-10 04:31:32,088 32k INFO ====> Epoch: 191
2023-02-10 04:32:56,573 32k INFO ====> Epoch: 192
2023-02-10 04:33:25,364 32k INFO Train Epoch: 193 [11%]
2023-02-10 04:33:25,365 32k INFO [2.3075990676879883, 2.3698666095733643, 14.287137985229492, 16.29629135131836, 0.5351936221122742, 14600, 9.762842452083883e-05]
2023-02-10 04:34:21,439 32k INFO ====> Epoch: 193
2023-02-10 04:35:46,041 32k INFO ====> Epoch: 194
2023-02-10 04:36:54,340 32k INFO Train Epoch: 195 [74%]
2023-02-10 04:36:54,341 32k INFO [2.3796615600585938, 2.2257769107818604, 9.005014419555664, 12.171199798583984, 0.7831335067749023, 14800, 9.760401894015275e-05]
2023-02-10 04:37:10,784 32k INFO ====> Epoch: 195
2023-02-10 04:38:35,380 32k INFO ====> Epoch: 196
2023-02-10 04:39:59,882 32k INFO ====> Epoch: 197
2023-02-10 04:40:45,368 32k INFO Train Epoch: 198 [37%]
2023-02-10 04:40:45,368 32k INFO [2.5116686820983887, 2.4769821166992188, 9.495701789855957, 15.055347442626953, 0.6151214241981506, 15000, 9.756742200804793e-05]
2023-02-10 04:40:49,881 32k INFO Saving model and optimizer state at iteration 198 to ./logs\32k\G_15000.pth
2023-02-10 04:41:08,000 32k INFO Saving model and optimizer state at iteration 198 to ./logs\32k\D_15000.pth
2023-02-10 04:41:51,223 32k INFO ====> Epoch: 198
2023-02-10 04:43:15,797 32k INFO ====> Epoch: 199
2023-02-10 04:44:40,469 32k INFO ====> Epoch: 200
2023-02-10 04:45:02,531 32k INFO Train Epoch: 201 [0%]
2023-02-10 04:45:02,531 32k INFO [2.66977858543396, 2.327565908432007, 8.785197257995605, 12.054095268249512, 0.33490169048309326, 15200, 9.753083879807726e-05]
2023-02-10 04:46:05,243 32k INFO ====> Epoch: 201
2023-02-10 04:47:29,831 32k INFO ====> Epoch: 202
2023-02-10 04:48:31,669 32k INFO Train Epoch: 203 [63%]
2023-02-10 04:48:31,669 32k INFO [2.4113712310791016, 2.208491325378418, 7.582664966583252, 10.762511253356934, 0.24080340564250946, 15400, 9.750645761229709e-05]
2023-02-10 04:48:54,843 32k INFO ====> Epoch: 203
2023-02-10 04:50:20,441 32k INFO ====> Epoch: 204
2023-02-10 04:51:44,905 32k INFO ====> Epoch: 205
2023-02-10 04:52:23,595 32k INFO Train Epoch: 206 [26%]
2023-02-10 04:52:23,596 32k INFO [2.406938076019287, 2.329049587249756, 13.670656204223633, 15.57223129272461, 0.5972563028335571, 15600, 9.746989726111722e-05]
2023-02-10 04:53:09,786 32k INFO ====> Epoch: 206
2023-02-10 04:54:34,352 32k INFO ====> Epoch: 207
2023-02-10 04:55:52,644 32k INFO Train Epoch: 208 [89%]
2023-02-10 04:55:52,645 32k INFO [2.4009199142456055, 2.4792189598083496, 8.647868156433105, 13.368301391601562, 0.6690515875816345, 15800, 9.744553130976908e-05]
2023-02-10 04:55:59,209 32k INFO ====> Epoch: 208
2023-02-10 04:57:23,732 32k INFO ====> Epoch: 209
2023-02-10 04:58:48,239 32k INFO ====> Epoch: 210
2023-02-10 04:59:43,567 32k INFO Train Epoch: 211 [53%]
2023-02-10 04:59:43,567 32k INFO [2.491450309753418, 2.5477209091186523, 7.471154689788818, 13.8217191696167, 0.5033202767372131, 16000, 9.740899380309685e-05]
2023-02-10 04:59:47,998 32k INFO Saving model and optimizer state at iteration 211 to ./logs\32k\G_16000.pth
2023-02-10 05:00:05,780 32k INFO Saving model and optimizer state at iteration 211 to ./logs\32k\D_16000.pth
2023-02-10 05:00:39,185 32k INFO ====> Epoch: 211
2023-02-10 05:02:04,575 32k INFO ====> Epoch: 212
2023-02-10 05:03:29,926 32k INFO ====> Epoch: 213
2023-02-10 05:04:02,744 32k INFO Train Epoch: 214 [16%]
2023-02-10 05:04:02,744 32k INFO [2.5403432846069336, 2.3642735481262207, 10.938908576965332, 14.273392677307129, 0.46957525610923767, 16200, 9.7372469996277e-05]
2023-02-10 05:04:55,587 32k INFO ====> Epoch: 214
2023-02-10 05:06:20,137 32k INFO ====> Epoch: 215
2023-02-10 05:07:31,842 32k INFO Train Epoch: 216 [79%]
2023-02-10 05:07:31,842 32k INFO [2.339552879333496, 2.709596872329712, 11.353394508361816, 18.01609992980957, 0.5840747356414795, 16400, 9.734812840022278e-05]
2023-02-10 05:07:45,003 32k INFO ====> Epoch: 216
2023-02-10 05:09:10,461 32k INFO ====> Epoch: 217
2023-02-10 05:10:35,839 32k INFO ====> Epoch: 218
2023-02-10 05:11:24,387 32k INFO Train Epoch: 219 [42%]
2023-02-10 05:11:24,387 32k INFO [2.4308414459228516, 2.6758008003234863, 9.025983810424805, 12.856972694396973, 0.7895928621292114, 16600, 9.731162741507607e-05]
2023-02-10 05:12:00,773 32k INFO ====> Epoch: 219
2023-02-10 05:13:26,131 32k INFO ====> Epoch: 220
2023-02-10 05:14:50,815 32k INFO ====> Epoch: 221
2023-02-10 05:15:16,163 32k INFO Train Epoch: 222 [5%]
2023-02-10 05:15:16,163 32k INFO [2.519866704940796, 2.3976869583129883, 6.224302768707275, 9.703218460083008, 0.8612720966339111, 16800, 9.727514011608789e-05]
2023-02-10 05:16:15,647 32k INFO ====> Epoch: 222
2023-02-10 05:17:41,078 32k INFO ====> Epoch: 223
2023-02-10 05:18:46,053 32k INFO Train Epoch: 224 [68%]
2023-02-10 05:18:46,053 32k INFO [2.6223254203796387, 1.9520517587661743, 4.571777820587158, 8.507990837097168, 0.792948305606842, 17000, 9.725082285098293e-05]
2023-02-10 05:18:50,420 32k INFO Saving model and optimizer state at iteration 224 to ./logs\32k\G_17000.pth
2023-02-10 05:19:09,845 32k INFO Saving model and optimizer state at iteration 224 to ./logs\32k\D_17000.pth
2023-02-10 05:19:33,204 32k INFO ====> Epoch: 224
2023-02-10 05:20:58,534 32k INFO ====> Epoch: 225
2023-02-10 05:22:23,931 32k INFO ====> Epoch: 226
2023-02-10 05:23:05,991 32k INFO Train Epoch: 227 [32%]
2023-02-10 05:23:05,991 32k INFO [2.3721673488616943, 2.349473476409912, 10.199991226196289, 15.970855712890625, 0.8750467300415039, 17200, 9.721435835085619e-05]
2023-02-10 05:23:48,977 32k INFO ====> Epoch: 227
2023-02-10 05:25:14,316 32k INFO ====> Epoch: 228
2023-02-10 05:26:35,734 32k INFO Train Epoch: 229 [95%]
2023-02-10 05:26:35,735 32k INFO [2.3928985595703125, 2.939208507537842, 17.157743453979492, 19.45735740661621, 0.6444438695907593, 17400, 9.719005628024282e-05]
2023-02-10 05:26:38,950 32k INFO ====> Epoch: 229
2023-02-10 05:28:04,250 32k INFO ====> Epoch: 230
2023-02-10 05:29:28,728 32k INFO ====> Epoch: 231
2023-02-10 05:30:28,074 32k INFO Train Epoch: 232 [58%]
2023-02-10 05:30:28,075 32k INFO [2.3620455265045166, 2.3724076747894287, 9.916801452636719, 15.793679237365723, 0.8138560652732849, 17600, 9.715361456473177e-05]
2023-02-10 05:30:54,438 32k INFO ====> Epoch: 232
2023-02-10 05:32:18,834 32k INFO ====> Epoch: 233
2023-02-10 05:33:43,436 32k INFO ====> Epoch: 234
2023-02-10 05:34:18,636 32k INFO Train Epoch: 235 [21%]
2023-02-10 05:34:18,636 32k INFO [2.421670913696289, 2.613611936569214, 10.162355422973633, 15.634803771972656, 0.8601481318473816, 17800, 9.711718651315591e-05]
2023-02-10 05:35:08,234 32k INFO ====> Epoch: 235
2023-02-10 05:36:32,671 32k INFO ====> Epoch: 236
2023-02-10 05:37:47,763 32k INFO Train Epoch: 237 [84%]
2023-02-10 05:37:47,764 32k INFO [2.3902554512023926, 2.546750783920288, 13.740601539611816, 15.754182815551758, 0.7283089756965637, 18000, 9.709290873398365e-05]
2023-02-10 05:37:52,211 32k INFO Saving model and optimizer state at iteration 237 to ./logs\32k\G_18000.pth
2023-02-10 05:38:05,355 32k INFO Saving model and optimizer state at iteration 237 to ./logs\32k\D_18000.pth
2023-02-10 05:38:18,717 32k INFO ====> Epoch: 237
2023-02-10 05:39:44,053 32k INFO ====> Epoch: 238
2023-02-10 05:41:09,330 32k INFO ====> Epoch: 239
2023-02-10 05:42:01,142 32k INFO Train Epoch: 240 [47%]
2023-02-10 05:42:01,143 32k INFO [2.312722682952881, 2.6106648445129395, 9.877479553222656, 15.368383407592773, 0.6470428109169006, 18200, 9.705650344424885e-05]
2023-02-10 05:42:34,235 32k INFO ====> Epoch: 240
2023-02-10 05:43:58,685 32k INFO ====> Epoch: 241
2023-02-10 05:45:23,999 32k INFO ====> Epoch: 242
2023-02-10 05:45:53,547 32k INFO Train Epoch: 243 [11%]
2023-02-10 05:45:53,547 32k INFO [2.3766026496887207, 2.4300293922424316, 12.744778633117676, 14.884074211120605, 0.6547593474388123, 18400, 9.702011180479129e-05]
2023-02-10 05:46:49,612 32k INFO ====> Epoch: 243
2023-02-10 05:48:15,106 32k INFO ====> Epoch: 244
2023-02-10 05:49:23,380 32k INFO Train Epoch: 245 [74%]
2023-02-10 05:49:23,380 32k INFO [2.580474376678467, 2.082796812057495, 9.039981842041016, 12.80046558380127, 0.5056825280189514, 18600, 9.699585829277933e-05]
2023-02-10
05:49:39,801 32k INFO ====> Epoch: 245 2023-02-10 05:51:04,233 32k INFO ====> Epoch: 246 2023-02-10 05:52:28,838 32k INFO ====> Epoch: 247 2023-02-10 05:53:14,110 32k INFO Train Epoch: 248 [37%] 2023-02-10 05:53:14,110 32k INFO [2.426253318786621, 2.482189416885376, 7.620111465454102, 7.9587178230285645, 0.5158562064170837, 18800, 9.695948939241093e-05] 2023-02-10 05:53:53,852 32k INFO ====> Epoch: 248 2023-02-10 05:55:18,546 32k INFO ====> Epoch: 249 2023-02-10 05:56:42,949 32k INFO ====> Epoch: 250 2023-02-10 05:57:05,033 32k INFO Train Epoch: 251 [0%] 2023-02-10 05:57:05,034 32k INFO [2.527033805847168, 2.064669370651245, 9.806694984436035, 12.288697242736816, 0.5984607934951782, 19000, 9.692313412867544e-05] 2023-02-10 05:57:09,464 32k INFO Saving model and optimizer state at iteration 251 to ./logs\32k\G_19000.pth 2023-02-10 05:57:24,997 32k INFO Saving model and optimizer state at iteration 251 to ./logs\32k\D_19000.pth 2023-02-10 05:58:31,373 32k INFO ====> Epoch: 251 2023-02-10 05:59:56,772 32k INFO ====> Epoch: 252 2023-02-10 06:00:59,380 32k INFO Train Epoch: 253 [63%] 2023-02-10 06:00:59,381 32k INFO [2.33793568611145, 2.2299203872680664, 7.934765815734863, 14.959351539611816, 0.5853436589241028, 19200, 9.689890485956725e-05] 2023-02-10 06:01:22,563 32k INFO ====> Epoch: 253 2023-02-10 06:02:47,113 32k INFO ====> Epoch: 254 2023-02-10 06:04:12,445 32k INFO ====> Epoch: 255 2023-02-10 06:04:51,929 32k INFO Train Epoch: 256 [26%] 2023-02-10 06:04:51,929 32k INFO [2.5420777797698975, 2.1400043964385986, 7.56562614440918, 11.221778869628906, 0.6662202477455139, 19400, 9.68625723121918e-05] 2023-02-10 06:05:38,131 32k INFO ====> Epoch: 256 2023-02-10 06:07:03,640 32k INFO ====> Epoch: 257 2023-02-10 06:08:22,686 32k INFO Train Epoch: 258 [89%] 2023-02-10 06:08:22,686 32k INFO [2.2824349403381348, 2.342818260192871, 11.228169441223145, 16.061767578125, 0.9010499119758606, 19600, 9.683835818259144e-05] 2023-02-10 06:08:29,233 32k INFO ====> Epoch: 258 
2023-02-10 06:09:54,606 32k INFO ====> Epoch: 259 2023-02-10 06:11:19,916 32k INFO ====> Epoch: 260 2023-02-10 06:12:16,024 32k INFO Train Epoch: 261 [53%] 2023-02-10 06:12:16,024 32k INFO [2.2774453163146973, 2.862053394317627, 11.609835624694824, 18.45319366455078, 0.8751690983772278, 19800, 9.680204833738185e-05] 2023-02-10 06:12:45,792 32k INFO ====> Epoch: 261 2023-02-10 06:14:11,129 32k INFO ====> Epoch: 262 2023-02-10 06:15:35,580 32k INFO ====> Epoch: 263 2023-02-10 06:16:08,423 32k INFO Train Epoch: 264 [16%] 2023-02-10 06:16:08,423 32k INFO [2.32112455368042, 2.333401918411255, 9.849132537841797, 13.351881980895996, 0.8110747933387756, 20000, 9.676575210666227e-05] 2023-02-10 06:16:13,685 32k INFO Saving model and optimizer state at iteration 264 to ./logs\32k\G_20000.pth 2023-02-10 06:16:30,849 32k INFO Saving model and optimizer state at iteration 264 to ./logs\32k\D_20000.pth 2023-02-10 06:17:26,961 32k INFO ====> Epoch: 264 2023-02-10 06:18:52,371 32k INFO ====> Epoch: 265 2023-02-10 06:20:05,039 32k INFO Train Epoch: 266 [79%] 2023-02-10 06:20:05,039 32k INFO [2.4292171001434326, 2.405426025390625, 10.425787925720215, 15.707857131958008, 0.6626737117767334, 20200, 9.674156218060047e-05] 2023-02-10 06:20:18,267 32k INFO ====> Epoch: 266 2023-02-10 06:21:43,774 32k INFO ====> Epoch: 267 2023-02-10 06:23:08,283 32k INFO ====> Epoch: 268 2023-02-10 06:23:56,720 32k INFO Train Epoch: 269 [42%] 2023-02-10 06:23:56,720 32k INFO [2.2364232540130615, 2.6233410835266113, 12.21335506439209, 15.925817489624023, 0.2746255695819855, 20400, 9.670528862935451e-05] 2023-02-10 06:24:33,096 32k INFO ====> Epoch: 269 2023-02-10 06:25:57,635 32k INFO ====> Epoch: 270 2023-02-10 06:27:22,159 32k INFO ====> Epoch: 271 2023-02-10 06:27:47,572 32k INFO Train Epoch: 272 [5%] 2023-02-10 06:27:47,572 32k INFO [2.279311180114746, 2.69355845451355, 11.382664680480957, 15.993112564086914, 0.7590271830558777, 20600, 9.666902867899003e-05] 2023-02-10 06:28:46,950 32k INFO ====> 
Epoch: 272 2023-02-10 06:30:11,421 32k INFO ====> Epoch: 273 2023-02-10 06:31:16,437 32k INFO Train Epoch: 274 [68%] 2023-02-10 06:31:16,437 32k INFO [2.2094767093658447, 2.471757411956787, 11.495461463928223, 15.169349670410156, 0.6064764857292175, 20800, 9.664486293227385e-05] 2023-02-10 06:31:36,272 32k INFO ====> Epoch: 274 2023-02-10 06:33:00,711 32k INFO ====> Epoch: 275 2023-02-10 06:34:25,096 32k INFO ====> Epoch: 276 2023-02-10 06:35:06,908 32k INFO Train Epoch: 277 [32%] 2023-02-10 06:35:06,909 32k INFO [2.3740768432617188, 2.466357707977295, 10.599915504455566, 11.676641464233398, 0.681009829044342, 21000, 9.660862563871342e-05] 2023-02-10 06:35:11,320 32k INFO Saving model and optimizer state at iteration 277 to ./logs\32k\G_21000.pth 2023-02-10 06:35:28,883 32k INFO Saving model and optimizer state at iteration 277 to ./logs\32k\D_21000.pth 2023-02-10 06:36:15,635 32k INFO ====> Epoch: 277 2023-02-10 06:37:40,991 32k INFO ====> Epoch: 278 2023-02-10 06:39:03,277 32k INFO Train Epoch: 279 [95%] 2023-02-10 06:39:03,277 32k INFO [2.7325069904327393, 2.2234275341033936, 7.8317461013793945, 10.93569564819336, 0.5372100472450256, 21200, 9.658447499181352e-05] 2023-02-10 06:39:06,507 32k INFO ====> Epoch: 279 2023-02-10 06:40:31,921 32k INFO ====> Epoch: 280 2023-02-10 06:41:57,223 32k INFO ====> Epoch: 281 2023-02-10 06:42:55,547 32k INFO Train Epoch: 282 [58%] 2023-02-10 06:42:55,548 32k INFO [2.078672409057617, 3.186306953430176, 10.578144073486328, 14.06082534790039, 0.670340895652771, 21400, 9.65482603409002e-05] 2023-02-10 06:43:21,901 32k INFO ====> Epoch: 282 2023-02-10 06:44:46,353 32k INFO ====> Epoch: 283 2023-02-10 06:46:10,896 32k INFO ====> Epoch: 284 2023-02-10 06:46:46,078 32k INFO Train Epoch: 285 [21%] 2023-02-10 06:46:46,078 32k INFO [2.4770002365112305, 2.4508018493652344, 9.283089637756348, 15.895116806030273, 0.6614298224449158, 21600, 9.651205926878348e-05] 2023-02-10 06:47:35,591 32k INFO ====> Epoch: 285 2023-02-10 06:49:00,001 32k 
INFO ====> Epoch: 286 2023-02-10 06:50:14,886 32k INFO Train Epoch: 287 [84%] 2023-02-10 06:50:14,887 32k INFO [2.2746620178222656, 2.5889053344726562, 10.645721435546875, 13.369743347167969, 0.817758321762085, 21800, 9.64879327619672e-05] 2023-02-10 06:50:24,830 32k INFO ====> Epoch: 287 2023-02-10 06:51:50,242 32k INFO ====> Epoch: 288 2023-02-10 06:53:14,658 32k INFO ====> Epoch: 289 2023-02-10 06:54:06,355 32k INFO Train Epoch: 290 [47%] 2023-02-10 06:54:06,355 32k INFO [2.408446788787842, 2.3364787101745605, 10.12728214263916, 14.59122085571289, 0.7084507942199707, 22000, 9.645175430986486e-05] 2023-02-10 06:54:10,737 32k INFO Saving model and optimizer state at iteration 290 to ./logs\32k\G_22000.pth 2023-02-10 06:54:28,834 32k INFO Saving model and optimizer state at iteration 290 to ./logs\32k\D_22000.pth 2023-02-10 06:55:05,411 32k INFO ====> Epoch: 290 2023-02-10 06:56:30,627 32k INFO ====> Epoch: 291 2023-02-10 06:57:55,820 32k INFO ====> Epoch: 292 2023-02-10 06:58:24,516 32k INFO Train Epoch: 293 [11%] 2023-02-10 06:58:24,516 32k INFO [2.3272194862365723, 2.602001190185547, 9.372489929199219, 11.096156120300293, 0.27660641074180603, 22200, 9.641558942298625e-05] 2023-02-10 06:59:20,776 32k INFO ====> Epoch: 293 2023-02-10 07:00:45,983 32k INFO ====> Epoch: 294 2023-02-10 07:01:55,037 32k INFO Train Epoch: 295 [74%] 2023-02-10 07:01:55,037 32k INFO [2.390826463699341, 2.554483413696289, 8.697030067443848, 14.283463478088379, 0.6869368553161621, 22400, 9.639148703212408e-05] 2023-02-10 07:02:11,478 32k INFO ====> Epoch: 295 2023-02-10 07:03:36,836 32k INFO ====> Epoch: 296 2023-02-10 07:05:02,243 32k INFO ====> Epoch: 297 2023-02-10 07:05:48,212 32k INFO Train Epoch: 298 [37%] 2023-02-10 07:05:48,213 32k INFO [2.413754940032959, 2.9149270057678223, 9.995668411254883, 12.166794776916504, 0.653400182723999, 22600, 9.635534474264972e-05] 2023-02-10 07:06:27,766 32k INFO ====> Epoch: 298 2023-02-10 07:07:52,162 32k INFO ====> Epoch: 299 2023-02-10 
07:09:16,689 32k INFO ====> Epoch: 300 2023-02-10 07:09:38,796 32k INFO Train Epoch: 301 [0%] 2023-02-10 07:09:38,796 32k INFO [2.2665956020355225, 2.7774581909179688, 9.704061508178711, 13.352103233337402, 0.712449312210083, 22800, 9.631921600483981e-05] 2023-02-10 07:10:41,520 32k INFO ====> Epoch: 301 2023-02-10 07:12:06,795 32k INFO ====> Epoch: 302 2023-02-10 07:13:08,471 32k INFO Train Epoch: 303 [63%] 2023-02-10 07:13:08,471 32k INFO [2.1530864238739014, 2.413203477859497, 11.726449012756348, 18.033466339111328, 0.6967122554779053, 23000, 9.629513770582634e-05] 2023-02-10 07:13:12,906 32k INFO Saving model and optimizer state at iteration 303 to ./logs\32k\G_23000.pth 2023-02-10 07:13:28,702 32k INFO Saving model and optimizer state at iteration 303 to ./logs\32k\D_23000.pth 2023-02-10 07:13:55,451 32k INFO ====> Epoch: 303 2023-02-10 07:15:20,800 32k INFO ====> Epoch: 304 2023-02-10 07:16:46,057 32k INFO ====> Epoch: 305 2023-02-10 07:17:25,604 32k INFO Train Epoch: 306 [26%] 2023-02-10 07:17:25,605 32k INFO [2.3469924926757812, 2.375382661819458, 9.387177467346191, 12.057092666625977, -0.1277291625738144, 23200, 9.625903154283315e-05] 2023-02-10 07:18:11,964 32k INFO ====> Epoch: 306 2023-02-10 07:19:36,400 32k INFO ====> Epoch: 307 2023-02-10 07:20:54,517 32k INFO Train Epoch: 308 [89%] 2023-02-10 07:20:54,517 32k INFO [2.3641486167907715, 2.28696608543396, 9.959922790527344, 17.36892318725586, 0.7711434364318848, 23400, 9.62349682889948e-05] 2023-02-10 07:21:01,068 32k INFO ====> Epoch: 308 2023-02-10 07:22:25,526 32k INFO ====> Epoch: 309 2023-02-10 07:23:50,156 32k INFO ====> Epoch: 310 2023-02-10 07:24:45,168 32k INFO Train Epoch: 311 [53%] 2023-02-10 07:24:45,169 32k INFO [2.5586600303649902, 2.3868942260742188, 8.270402908325195, 11.41346549987793, 0.6984982490539551, 23600, 9.619888468671259e-05] 2023-02-10 07:25:14,874 32k INFO ====> Epoch: 311 2023-02-10 07:26:39,243 32k INFO ====> Epoch: 312 2023-02-10 07:28:03,612 32k INFO ====> Epoch: 313 
2023-02-10 07:28:35,612 32k INFO Train Epoch: 314 [16%] 2023-02-10 07:28:35,613 32k INFO [1.8997191190719604, 2.9339983463287354, 9.256068229675293, 12.056029319763184, 0.7804506421089172, 23800, 9.61628146140899e-05] 2023-02-10 07:29:28,369 32k INFO ====> Epoch: 314 2023-02-10 07:30:52,694 32k INFO ====> Epoch: 315 2023-02-10 07:32:04,287 32k INFO Train Epoch: 316 [79%] 2023-02-10 07:32:04,287 32k INFO [2.4035465717315674, 2.7094507217407227, 9.096111297607422, 14.042303085327148, 0.8556358814239502, 24000, 9.613877541298036e-05] 2023-02-10 07:32:08,663 32k INFO Saving model and optimizer state at iteration 316 to ./logs\32k\G_24000.pth 2023-02-10 07:32:24,726 32k INFO Saving model and optimizer state at iteration 316 to ./logs\32k\D_24000.pth 2023-02-10 07:32:41,400 32k INFO ====> Epoch: 316 2023-02-10 07:34:06,691 32k INFO ====> Epoch: 317