2023-02-22 00:08:12,618 32k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 32000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'est': 0}, 'model_dir': './logs\\32k'}
2023-02-22 00:08:12,618 32k WARNING K:\AI\so-vits-svc-32k is not a git repository, therefore hash value comparison will be ignored.
2023-02-22 00:08:34,870 32k INFO Loaded checkpoint './logs\32k\G_0.pth' (iteration 1)
2023-02-22 00:08:37,960 32k INFO Loaded checkpoint './logs\32k\D_0.pth' (iteration 1)
2023-02-22 00:09:12,596 32k INFO Train Epoch: 1 [0%]
2023-02-22 00:09:12,597 32k INFO [2.145352363586426, 3.4001522064208984, 12.26406478881836, 40.874549865722656, 8.646808624267578, 0, 0.0001]
2023-02-22 00:09:19,053 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\G_0.pth
2023-02-22 00:09:37,859 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\D_0.pth
2023-02-22 00:11:10,285 32k INFO ====> Epoch: 1
2023-02-22 00:12:56,112 32k INFO ====> Epoch: 2
2023-02-22 00:13:21,893 32k INFO Train Epoch: 3 [4%]
2023-02-22 00:13:21,893 32k INFO [2.3486380577087402, 2.4642438888549805, 14.349164009094238, 24.486454010009766, 1.6935454607009888, 200, 9.99750015625e-05]
2023-02-22 00:14:42,277 32k INFO ====> Epoch: 3
2023-02-22 00:16:27,846 32k INFO ====> Epoch: 4
2023-02-22 00:16:56,860 32k INFO Train Epoch: 5 [8%]
2023-02-22 00:16:56,860 32k INFO [2.4858646392822266, 2.255676746368408, 11.70196533203125, 19.060020446777344, 1.3148207664489746, 400, 9.995000937421877e-05]
2023-02-22 00:18:13,793 32k INFO ====> Epoch: 5
2023-02-22 00:19:59,483 32k INFO ====> Epoch: 6
2023-02-22 00:20:31,940 32k INFO Train Epoch: 7 [12%]
2023-02-22 00:20:31,940 32k INFO [2.3629703521728516, 2.449742078781128, 9.971107482910156, 18.171764373779297, 1.2438346147537231, 600, 9.99250234335941e-05]
2023-02-22 00:21:45,560 32k INFO ====> Epoch: 7
2023-02-22 00:23:31,255 32k INFO ====> Epoch: 8
2023-02-22 00:24:07,127 32k INFO Train Epoch: 9 [16%]
2023-02-22 00:24:07,127 32k INFO [2.7334389686584473, 2.1502294540405273, 7.149779319763184, 18.11358642578125, 1.2457014322280884, 800, 9.990004373906418e-05]
2023-02-22 00:25:17,128 32k INFO ====> Epoch: 9
2023-02-22 00:27:02,598 32k INFO ====> Epoch: 10
2023-02-22 00:27:41,816 32k INFO Train Epoch: 11 [20%]
2023-02-22 00:27:41,816 32k INFO [2.525851011276245, 2.173820734024048, 10.240619659423828, 18.13275146484375, 1.0487183332443237, 1000, 9.987507028906759e-05]
2023-02-22 00:27:46,068 32k INFO Saving model and optimizer state at iteration 11 to ./logs\32k\G_1000.pth
2023-02-22 00:28:05,134 32k INFO Saving model and optimizer state at iteration 11 to ./logs\32k\D_1000.pth
2023-02-22 00:29:14,629 32k INFO ====> Epoch: 11
2023-02-22 00:31:00,159 32k INFO ====> Epoch: 12
2023-02-22 00:31:42,863 32k INFO Train Epoch: 13 [24%]
2023-02-22 00:31:42,863 32k INFO [2.4192304611206055, 2.4427719116210938, 12.2838773727417, 23.12200164794922, 1.0217249393463135, 1200, 9.98501030820433e-05]
2023-02-22 00:32:46,074 32k INFO ====> Epoch: 13
2023-02-22 00:34:31,694 32k INFO ====> Epoch: 14
2023-02-22 00:35:17,797 32k INFO Train Epoch: 15 [29%]
2023-02-22 00:35:17,798 32k INFO [2.5181407928466797, 2.393437385559082, 8.553373336791992, 21.508071899414062, 0.9627037048339844, 1400, 9.982514211643064e-05]
2023-02-22 00:36:17,539 32k INFO ====> Epoch: 15
2023-02-22 00:38:03,121 32k INFO ====> Epoch: 16
2023-02-22 00:38:52,596 32k INFO Train Epoch: 17 [33%]
2023-02-22 00:38:52,597 32k INFO [2.3772032260894775, 2.1716203689575195, 12.80620002746582, 22.022525787353516, 1.2130091190338135, 1600, 9.980018739066937e-05]
2023-02-22 00:39:48,945 32k INFO ====> Epoch: 17
2023-02-22 00:41:34,608 32k INFO ====> Epoch: 18
2023-02-22 00:42:27,609 32k INFO Train Epoch: 19 [37%]
2023-02-22 00:42:27,609 32k INFO [2.588810920715332, 2.084688901901245, 10.23686408996582, 17.19050407409668, 1.0559662580490112, 1800, 9.977523890319963e-05]
2023-02-22 00:43:20,562 32k INFO ====> Epoch: 19
2023-02-22 00:45:06,198 32k INFO ====> Epoch: 20
2023-02-22 00:46:02,554 32k INFO Train Epoch: 21 [41%]
2023-02-22 00:46:02,555 32k INFO [2.3476009368896484, 2.4465882778167725, 14.103896141052246, 22.818317413330078, 0.7795078158378601, 2000, 9.975029665246193e-05]
2023-02-22 00:46:06,816 32k INFO Saving model and optimizer state at iteration 21 to ./logs\32k\G_2000.pth
2023-02-22 00:46:24,690 32k INFO Saving model and optimizer state at iteration 21 to ./logs\32k\D_2000.pth
2023-02-22 00:47:17,669 32k INFO ====> Epoch: 21
2023-02-22 00:49:03,347 32k INFO ====> Epoch: 22
2023-02-22 00:50:03,181 32k INFO Train Epoch: 23 [45%]
2023-02-22 00:50:03,181 32k INFO [2.331178903579712, 2.411133289337158, 13.773752212524414, 21.044940948486328, 1.075492262840271, 2200, 9.972536063689719e-05]
2023-02-22 00:50:49,196 32k INFO ====> Epoch: 23
2023-02-22 00:52:35,089 32k INFO ====> Epoch: 24
2023-02-22 00:53:38,492 32k INFO Train Epoch: 25 [49%]
2023-02-22 00:53:38,492 32k INFO [2.7379379272460938, 2.1680490970611572, 10.426996231079102, 19.272850036621094, 1.2769395112991333, 2400, 9.970043085494672e-05]
2023-02-22 00:54:21,188 32k INFO ====> Epoch: 25
2023-02-22 00:56:06,784 32k INFO ====> Epoch: 26
2023-02-22 00:57:13,707 32k INFO Train Epoch: 27 [53%]
2023-02-22 00:57:13,707 32k INFO [2.4311702251434326, 2.5075151920318604, 9.771134376525879, 13.715150833129883, 0.9923850297927856, 2600, 9.967550730505221e-05]
2023-02-22 00:57:53,034 32k INFO ====> Epoch: 27
2023-02-22 00:59:38,682 32k INFO ====> Epoch: 28
2023-02-22 01:00:48,916 32k INFO Train Epoch: 29 [57%]
2023-02-22 01:00:48,916 32k INFO [2.397207736968994, 2.2835323810577393, 11.884821891784668, 21.50246810913086, 0.7772465944290161, 2800, 9.965058998565574e-05]
2023-02-22 01:01:24,697 32k INFO ====> Epoch: 29
2023-02-22 01:03:10,447 32k INFO ====> Epoch: 30
2023-02-22 01:04:23,993 32k INFO Train Epoch: 31 [61%]
2023-02-22 01:04:23,994 32k INFO [2.4665157794952393, 2.223632574081421, 10.100894927978516, 15.292652130126953, 1.0024449825286865, 3000, 9.962567889519979e-05]
2023-02-22 01:04:28,221 32k INFO Saving model and optimizer state at iteration 31 to ./logs\32k\G_3000.pth
2023-02-22 01:04:48,543 32k INFO Saving model and optimizer state at iteration 31 to ./logs\32k\D_3000.pth
2023-02-22 01:05:24,206 32k INFO ====> Epoch: 31
2023-02-22 01:07:09,674 32k INFO ====> Epoch: 32
2023-02-22 01:08:26,794 32k INFO Train Epoch: 33 [65%]
2023-02-22 01:08:26,795 32k INFO [2.4395065307617188, 2.381229877471924, 11.58455753326416, 21.415176391601562, 1.0025029182434082, 3200, 9.960077403212722e-05]
2023-02-22 01:08:55,697 32k INFO ====> Epoch: 33
2023-02-22 01:10:41,472 32k INFO ====> Epoch: 34
2023-02-22 01:12:02,104 32k INFO Train Epoch: 35 [69%]
2023-02-22 01:12:02,105 32k INFO [2.388833522796631, 2.4029440879821777, 9.948843002319336, 17.525562286376953, 0.9836262464523315, 3400, 9.957587539488128e-05]
2023-02-22 01:12:27,570 32k INFO ====> Epoch: 35
2023-02-22 01:14:13,128 32k INFO ====> Epoch: 36
2023-02-22 01:15:37,186 32k INFO Train Epoch: 37 [73%]
2023-02-22 01:15:37,187 32k INFO [2.396942615509033, 2.3084261417388916, 7.296802997589111, 13.69768238067627, 0.9189913868904114, 3600, 9.95509829819056e-05]
2023-02-22 01:15:59,261 32k INFO ====> Epoch: 37
2023-02-22 01:17:45,166 32k INFO ====> Epoch: 38
2023-02-22 01:19:12,693 32k INFO Train Epoch: 39 [78%]
2023-02-22 01:19:12,694 32k INFO [2.34450626373291, 2.3164803981781006, 13.175307273864746, 21.369274139404297, 0.5636721849441528, 3800, 9.952609679164422e-05]
2023-02-22 01:19:31,254 32k INFO ====> Epoch: 39
2023-02-22 01:21:17,022 32k INFO ====> Epoch: 40
2023-02-22 01:22:47,780 32k INFO Train Epoch: 41 [82%]
2023-02-22 01:22:47,781 32k INFO [2.6599762439727783, 2.186920642852783, 8.917391777038574, 14.608092308044434, 0.7955180406570435, 4000, 9.950121682254156e-05]
2023-02-22 01:22:52,052 32k INFO Saving model and optimizer state at iteration 41 to ./logs\32k\G_4000.pth
2023-02-22 01:23:13,030 32k INFO Saving model and optimizer state at iteration 41 to ./logs\32k\D_4000.pth
2023-02-22 01:23:31,585 32k INFO ====> Epoch: 41
2023-02-22 01:25:17,001 32k INFO ====> Epoch: 42
2023-02-22 01:26:51,140 32k INFO Train Epoch: 43 [86%]
2023-02-22 01:26:51,141 32k INFO [2.218780994415283, 2.8232128620147705, 10.92943286895752, 15.00414752960205, 0.885425865650177, 4200, 9.947634307304244e-05]
2023-02-22 01:27:02,817 32k INFO ====> Epoch: 43
2023-02-22 01:28:48,594 32k INFO ====> Epoch: 44
2023-02-22 01:30:26,338 32k INFO Train Epoch: 45 [90%]
2023-02-22 01:30:26,338 32k INFO [2.6101627349853516, 2.4599862098693848, 10.045958518981934, 18.106704711914062, 0.9387855529785156, 4400, 9.945147554159202e-05]
2023-02-22 01:30:34,593 32k INFO ====> Epoch: 45
2023-02-22 01:32:20,323 32k INFO ====> Epoch: 46
2023-02-22 01:34:01,568 32k INFO Train Epoch: 47 [94%]
2023-02-22 01:34:01,569 32k INFO [2.4062795639038086, 2.434760093688965, 9.70323371887207, 17.096527099609375, 0.6637702584266663, 4600, 9.942661422663591e-05]
2023-02-22 01:34:06,513 32k INFO ====> Epoch: 47
2023-02-22 01:35:52,323 32k INFO ====> Epoch: 48
2023-02-22 01:37:36,966 32k INFO Train Epoch: 49 [98%]
2023-02-22 01:37:36,967 32k INFO [2.4078311920166016, 2.423760175704956, 11.222769737243652, 20.222986221313477, 0.9975738525390625, 4800, 9.940175912662009e-05]
2023-02-22 01:37:38,357 32k INFO ====> Epoch: 49
2023-02-22 01:39:23,989 32k INFO ====> Epoch: 50
2023-02-22 01:41:09,562 32k INFO ====> Epoch: 51
2023-02-22 01:41:33,357 32k INFO Train Epoch: 52 [2%]
2023-02-22 01:41:33,358 32k INFO [2.568455934524536, 2.097076177597046, 8.081390380859375, 16.72420310974121, 1.0670697689056396, 5000, 9.936448812621091e-05]
2023-02-22 01:41:37,575 32k INFO Saving model and optimizer state at iteration 52 to ./logs\32k\G_5000.pth
2023-02-22 01:41:53,583 32k INFO Saving model and optimizer state at iteration 52 to ./logs\32k\D_5000.pth
2023-02-22 01:43:19,034 32k INFO ====> Epoch: 52
2023-02-22 01:45:04,531 32k INFO ====> Epoch: 53
2023-02-22 01:45:31,828 32k INFO Train Epoch: 54 [6%]
2023-02-22 01:45:31,829 32k INFO [2.367290735244751, 2.359792947769165, 12.402170181274414, 22.279163360595703, 0.5312451124191284, 5200, 9.933964855674948e-05]
2023-02-22 01:46:50,519 32k INFO ====> Epoch: 54
2023-02-22 01:48:36,114 32k INFO ====> Epoch: 55
2023-02-22 01:49:06,731 32k INFO Train Epoch: 56 [10%]
2023-02-22 01:49:06,731 32k INFO [2.4941694736480713, 2.2833311557769775, 11.451539039611816, 18.980377197265625, 1.122371792793274, 5400, 9.931481519679228e-05]
2023-02-22 01:50:22,005 32k INFO ====> Epoch: 56
2023-02-22 01:52:08,000 32k INFO ====> Epoch: 57
2023-02-22 01:52:43,577 32k INFO Train Epoch: 58 [14%]
2023-02-22 01:52:43,577 32k INFO [2.5777227878570557, 2.013221025466919, 7.400770664215088, 16.267370223999023, 0.9109489917755127, 5600, 9.928998804478705e-05]
2023-02-22 01:53:55,400 32k INFO ====> Epoch: 58
2023-02-22 01:55:41,267 32k INFO ====> Epoch: 59
2023-02-22 01:56:18,836 32k INFO Train Epoch: 60 [18%]
2023-02-22 01:56:18,837 32k INFO [2.3286633491516113, 2.273066282272339, 11.276134490966797, 19.17591667175293, 0.6657251715660095, 5800, 9.926516709918191e-05]
2023-02-22 01:57:27,294 32k INFO ====> Epoch: 60
2023-02-22 01:59:12,967 32k INFO ====> Epoch: 61
2023-02-22 01:59:53,907 32k INFO Train Epoch: 62 [22%]
2023-02-22 01:59:53,907 32k INFO [2.572532892227173, 2.336395025253296, 12.095869064331055, 18.98446273803711, 0.7074256539344788, 6000, 9.924035235842533e-05]
2023-02-22 01:59:58,246 32k INFO Saving model and optimizer state at iteration 62 to ./logs\32k\G_6000.pth
2023-02-22 02:00:15,073 32k INFO Saving model and optimizer state at iteration 62 to ./logs\32k\D_6000.pth
2023-02-22 02:01:23,134 32k INFO ====> Epoch: 62
2023-02-22 02:03:08,840 32k INFO ====> Epoch: 63
2023-02-22 02:03:53,319 32k INFO Train Epoch: 64 [27%]
2023-02-22 02:03:53,320 32k INFO [2.5473315715789795, 2.3198702335357666, 8.785454750061035, 16.218992233276367, 0.8188357949256897, 6200, 9.921554382096622e-05]
2023-02-22 02:04:54,908 32k INFO ====> Epoch: 64
2023-02-22 02:06:40,700 32k INFO ====> Epoch: 65
2023-02-22 02:07:28,567 32k INFO Train Epoch: 66 [31%]
2023-02-22 02:07:28,567 32k INFO [2.507612705230713, 2.3138248920440674, 6.625302791595459, 13.481208801269531, 0.4462297260761261, 6400, 9.919074148525384e-05]
2023-02-22 02:08:26,757 32k INFO ====> Epoch: 66
2023-02-22 02:10:12,508 32k INFO ====> Epoch: 67
2023-02-22 02:11:03,725 32k INFO Train Epoch: 68 [35%]
2023-02-22 02:11:03,725 32k INFO [2.5980536937713623, 2.279015064239502, 7.950907230377197, 13.451935768127441, 1.1401939392089844, 6600, 9.916594534973787e-05]
2023-02-22 02:11:58,445 32k INFO ====> Epoch: 68
2023-02-22 02:13:44,135 32k INFO ====> Epoch: 69
2023-02-22 02:14:38,891 32k INFO Train Epoch: 70 [39%]
2023-02-22 02:14:38,892 32k INFO [2.5319972038269043, 2.51550030708313, 10.939152717590332, 19.677751541137695, 0.63681960105896, 6800, 9.914115541286833e-05]
2023-02-22 02:15:30,175 32k INFO ====> Epoch: 70
2023-02-22 02:17:15,908 32k INFO ====> Epoch: 71
2023-02-22 02:18:14,245 32k INFO Train Epoch: 72 [43%]
2023-02-22 02:18:14,245 32k INFO [2.5482029914855957, 2.192294120788574, 9.581657409667969, 15.53216552734375, 1.2906405925750732, 7000, 9.911637167309565e-05]
2023-02-22 02:18:18,620 32k INFO Saving model and optimizer state at iteration 72 to ./logs\32k\G_7000.pth
2023-02-22 02:18:35,298 32k INFO Saving model and optimizer state at iteration 72 to ./logs\32k\D_7000.pth
2023-02-22 02:19:26,420 32k INFO ====> Epoch: 72
2023-02-22 02:21:12,370 32k INFO ====> Epoch: 73
2023-02-22 02:22:13,800 32k INFO Train Epoch: 74 [47%]
2023-02-22 02:22:13,801 32k INFO [2.5285887718200684, 2.162809371948242, 13.120260238647461, 21.690521240234375, 0.7244228720664978, 7200, 9.909159412887068e-05]
2023-02-22 02:22:58,230 32k INFO ====> Epoch: 74
2023-02-22 02:24:44,154 32k INFO ====> Epoch: 75
2023-02-22 02:25:49,172 32k INFO Train Epoch: 76 [51%]
2023-02-22 02:25:49,172 32k INFO [2.3859612941741943, 2.5308663845062256, 9.781866073608398, 18.245580673217773, 0.9907944798469543, 7400, 9.906682277864462e-05]
2023-02-22 02:26:30,190 32k INFO ====> Epoch: 76
2023-02-22 02:28:15,895 32k INFO ====> Epoch: 77
2023-02-22 02:29:24,473 32k INFO Train Epoch: 78 [55%]
2023-02-22 02:29:24,473 32k INFO [2.386472702026367, 2.3571693897247314, 11.112114906311035, 19.415468215942383, 0.8537010550498962, 7600, 9.904205762086905e-05]
2023-02-22 02:30:01,938 32k INFO ====> Epoch: 78
2023-02-22 02:31:47,642 32k INFO ====> Epoch: 79
2023-02-22 02:32:59,591 32k INFO Train Epoch: 80 [59%]
2023-02-22 02:32:59,592 32k INFO [2.4535465240478516, 2.43351411819458, 10.742626190185547, 20.20539093017578, 0.905364990234375, 7800, 9.901729865399597e-05]
2023-02-22 02:33:33,554 32k INFO ====> Epoch: 80
2023-02-22 02:35:19,249 32k INFO ====> Epoch: 81
2023-02-22 02:36:34,659 32k INFO Train Epoch: 82 [63%]
2023-02-22 02:36:34,659 32k INFO [2.5064964294433594, 2.268216848373413, 9.974581718444824, 17.910846710205078, 0.9886245727539062, 8000, 9.899254587647776e-05]
2023-02-22 02:36:38,899 32k INFO Saving model and optimizer state at iteration 82 to ./logs\32k\G_8000.pth
2023-02-22 02:36:55,742 32k INFO Saving model and optimizer state at iteration 82 to ./logs\32k\D_8000.pth
2023-02-22 02:37:29,423 32k INFO ====> Epoch: 82
2023-02-22 02:39:15,168 32k INFO ====> Epoch: 83
2023-02-22 02:40:34,013 32k INFO Train Epoch: 84 [67%]
2023-02-22 02:40:34,013 32k INFO [2.248044490814209, 2.346158266067505, 11.082698822021484, 14.95010757446289, 0.6171810626983643, 8200, 9.896779928676716e-05]
2023-02-22 02:41:01,209 32k INFO ====> Epoch: 84
2023-02-22 02:42:47,112 32k INFO ====> Epoch: 85
2023-02-22 02:44:09,353 32k INFO Train Epoch: 86 [71%]
2023-02-22 02:44:09,353 32k INFO [2.6061596870422363, 2.314436912536621, 10.66403865814209, 19.200468063354492, 1.07802414894104, 8400, 9.894305888331732e-05]
2023-02-22 02:44:33,096 32k INFO ====> Epoch: 86
2023-02-22 02:46:18,750 32k INFO ====> Epoch: 87
2023-02-22 02:47:44,525 32k INFO Train Epoch: 88 [76%]
2023-02-22 02:47:44,526 32k INFO [2.4097466468811035, 2.220452308654785, 11.689645767211914, 18.19406509399414, 0.7397383451461792, 8600, 9.891832466458178e-05]
2023-02-22 02:48:04,901 32k INFO ====> Epoch: 88
2023-02-22 02:49:50,659 32k INFO ====> Epoch: 89
2023-02-22 02:51:19,746 32k INFO Train Epoch: 90 [80%]
2023-02-22 02:51:19,746 32k INFO [2.4018168449401855, 2.383319139480591, 12.725435256958008, 18.570293426513672, 1.2559199333190918, 8800, 9.889359662901445e-05]
2023-02-22 02:51:36,642 32k INFO ====> Epoch: 90
2023-02-22 02:53:22,585 32k INFO ====> Epoch: 91
2023-02-22 02:54:55,326 32k INFO Train Epoch: 92 [84%]
2023-02-22 02:54:55,327 32k INFO [2.290731430053711, 2.6917805671691895, 11.489081382751465, 20.181421279907227, 0.704178512096405, 9000, 9.886887477506964e-05]
2023-02-22 02:54:59,569 32k INFO Saving model and optimizer state at iteration 92 to ./logs\32k\G_9000.pth
2023-02-22 02:55:15,826 32k INFO Saving model and optimizer state at iteration 92 to ./logs\32k\D_9000.pth
2023-02-22 02:55:32,580 32k INFO ====> Epoch: 92
2023-02-22 02:57:18,717 32k INFO ====> Epoch: 93
2023-02-22 02:58:55,426 32k INFO Train Epoch: 94 [88%]
2023-02-22 02:58:55,426 32k INFO [2.7609176635742188, 2.471208333969116, 5.6250128746032715, 12.03889274597168, 0.7381411790847778, 9200, 9.884415910120204e-05]
2023-02-22 02:59:05,407 32k INFO ====> Epoch: 94
2023-02-22 03:00:51,801 32k INFO ====> Epoch: 95
2023-02-22 03:02:31,827 32k INFO Train Epoch: 96 [92%]
2023-02-22 03:02:31,827 32k INFO [2.406148672103882, 2.274200677871704, 11.835630416870117, 21.682048797607422, 1.1413294076919556, 9400, 9.881944960586671e-05]
2023-02-22 03:02:38,366 32k INFO ====> Epoch: 96
2023-02-22 03:04:24,238 32k INFO ====> Epoch: 97
2023-02-22 03:06:07,737 32k INFO Train Epoch: 98 [96%]
2023-02-22 03:06:07,737 32k INFO [2.4189469814300537, 2.4262475967407227, 7.4442033767700195, 12.63837718963623, 0.5815107822418213, 9600, 9.879474628751914e-05]
2023-02-22 03:06:10,846 32k INFO ====> Epoch: 98
2023-02-22 03:07:57,269 32k INFO ====> Epoch: 99
2023-02-22 03:09:42,957 32k INFO ====> Epoch: 100
2023-02-22 03:10:05,045 32k INFO Train Epoch: 101 [0%]
2023-02-22 03:10:05,046 32k INFO [2.5269906520843506, 2.3146581649780273, 7.8620123863220215, 17.898523330688477, 0.5046048760414124, 9800, 9.875770288847208e-05]
2023-02-22 03:11:28,848 32k INFO ====> Epoch: 101
2023-02-22 03:13:14,566 32k INFO ====> Epoch: 102
2023-02-22 03:13:40,056 32k INFO Train Epoch: 103 [4%]
2023-02-22 03:13:40,057 32k INFO [2.1958470344543457, 2.465301752090454, 14.825515747070312, 22.445404052734375, 0.4519413709640503, 10000, 9.873301500583906e-05]
2023-02-22 03:13:44,302 32k INFO Saving model and optimizer state at iteration 103 to ./logs\32k\G_10000.pth
2023-02-22 03:14:01,957 32k INFO Saving model and optimizer state at iteration 103 to ./logs\32k\D_10000.pth
2023-02-22 03:15:25,803 32k INFO ====> Epoch: 103
2023-02-22 03:17:12,136 32k INFO ====> Epoch: 104
2023-02-22 03:17:41,752 32k INFO Train Epoch: 105 [8%]
2023-02-22 03:17:41,752 32k INFO [2.530261993408203, 2.3227694034576416, 11.736129760742188, 16.371919631958008, 0.7204845547676086, 10200, 9.870833329479095e-05]
2023-02-22 03:18:58,780 32k INFO ====> Epoch: 105
2023-02-22 03:20:45,072 32k INFO ====> Epoch: 106
2023-02-22 03:21:18,158 32k INFO Train Epoch: 107 [12%]
2023-02-22 03:21:18,159 32k INFO [2.4681787490844727, 2.8919472694396973, 8.824729919433594, 16.09588623046875, 0.8613712787628174, 10400, 9.868365775378495e-05]
2023-02-22 03:22:31,624 32k INFO ====> Epoch: 107
2023-02-22 03:24:17,336 32k INFO ====> Epoch: 108
2023-02-22 03:24:53,255 32k INFO Train Epoch: 109 [16%]
2023-02-22 03:24:53,256 32k INFO [2.483661413192749, 2.4146034717559814, 9.786486625671387, 18.913461685180664, 1.075326919555664, 10600, 9.865898838127865e-05]
2023-02-22 03:26:03,360 32k INFO ====> Epoch: 109
2023-02-22 03:27:49,697 32k INFO ====> Epoch: 110
2023-02-22 03:28:29,576 32k INFO Train Epoch: 111 [20%]
2023-02-22 03:28:29,577 32k INFO [2.538566827774048, 2.12762713432312, 7.408393859863281, 13.515085220336914, 0.8555375933647156, 10800, 9.863432517573002e-05]
2023-02-22 03:29:36,385 32k INFO ====> Epoch: 111
2023-02-22 03:31:22,092 32k INFO ====> Epoch: 112
2023-02-22 03:32:04,865 32k INFO Train Epoch: 113 [24%]
2023-02-22 03:32:04,866 32k INFO [2.1984810829162598, 2.4401934146881104, 14.276609420776367, 22.693824768066406, 0.9264913201332092, 11000, 9.86096681355974e-05]
2023-02-22 03:32:09,113 32k INFO Saving model and optimizer state at iteration 113 to ./logs\32k\G_11000.pth
2023-02-22 03:32:25,992 32k INFO Saving model and optimizer state at iteration 113 to ./logs\32k\D_11000.pth
2023-02-22 03:33:32,900 32k INFO ====> Epoch: 113
2023-02-22 03:35:19,193 32k INFO ====> Epoch: 114
2023-02-22 03:36:06,007 32k INFO Train Epoch: 115 [29%]
2023-02-22 03:36:06,008 32k INFO [2.3687469959259033, 2.8906643390655518, 9.432950019836426, 19.32032585144043, 0.8109574913978577, 11200, 9.858501725933955e-05]
2023-02-22 03:37:05,975 32k INFO ====> Epoch: 115
2023-02-22 03:38:52,586 32k INFO ====> Epoch: 116
2023-02-22 03:39:42,332 32k INFO Train Epoch: 117 [33%]
2023-02-22 03:39:42,333 32k INFO [2.178363561630249, 2.667116641998291, 13.681983947753906, 20.458284378051758, 0.8319520950317383, 11400, 9.85603725454156e-05]
2023-02-22 03:40:38,811 32k INFO ====> Epoch: 117
2023-02-22 03:42:24,730 32k INFO ====> Epoch: 118
2023-02-22 03:43:19,888 32k INFO Train Epoch: 119 [37%]
2023-02-22 03:43:19,889 32k INFO [2.546008586883545, 2.2007601261138916, 10.31205940246582, 18.787328720092773, 1.2723021507263184, 11600, 9.853573399228505e-05]
2023-02-22 03:44:12,958 32k INFO ====> Epoch: 119
2023-02-22 03:45:58,715 32k INFO ====> Epoch: 120
2023-02-22 03:46:55,149 32k INFO Train Epoch: 121 [41%]
2023-02-22 03:46:55,149 32k INFO [2.446214437484741, 2.449553966522217, 13.851907730102539, 21.115680694580078, 0.7339575886726379, 11800, 9.851110159840781e-05]
2023-02-22 03:47:44,775 32k INFO ====> Epoch: 121
2023-02-22 03:49:30,581 32k INFO ====> Epoch: 122
2023-02-22 03:50:30,560 32k INFO Train Epoch: 123 [45%]
2023-02-22 03:50:30,560 32k INFO [2.35149884223938, 2.75274658203125, 12.845666885375977, 18.79739761352539, 1.1703294515609741, 12000, 9.848647536224416e-05]
2023-02-22 03:50:34,776 32k INFO Saving model and optimizer state at iteration 123 to ./logs\32k\G_12000.pth
2023-02-22 03:50:53,144 32k INFO Saving model and optimizer state at iteration 123 to ./logs\32k\D_12000.pth
2023-02-22 03:51:42,693 32k INFO ====> Epoch: 123
2023-02-22 03:53:28,929 32k INFO ====> Epoch: 124
2023-02-22 03:54:32,842 32k INFO Train Epoch: 125 [49%]
2023-02-22 03:54:32,842 32k INFO [2.589078664779663, 2.1445438861846924, 11.166776657104492, 17.973941802978516, 0.7241864204406738, 12200, 9.846185528225477e-05]
2023-02-22 03:55:15,637 32k INFO ====> Epoch: 125
2023-02-22 03:57:01,184 32k INFO ====> Epoch: 126
2023-02-22 03:58:08,640 32k INFO Train Epoch: 127 [53%]
2023-02-22 03:58:08,640 32k INFO [2.5327906608581543, 2.081197500228882, 7.769866943359375, 14.17745590209961, 0.6943719983100891, 12400, 9.84372413569007e-05]
2023-02-22 03:58:47,757 32k INFO ====> Epoch: 127
2023-02-22 04:00:34,155 32k INFO ====> Epoch: 128
2023-02-22 04:01:44,324 32k INFO Train Epoch: 129 [57%]
2023-02-22 04:01:44,325 32k INFO [2.5065693855285645, 2.289240837097168, 8.885828971862793, 15.302033424377441, 1.1030964851379395, 12600, 9.841263358464336e-05]
2023-02-22 04:02:20,017 32k INFO ====> Epoch: 129
2023-02-22 04:04:05,745 32k INFO ====> Epoch: 130
2023-02-22 04:05:19,448 32k INFO Train Epoch: 131 [61%]
2023-02-22 04:05:19,448 32k INFO [2.339315414428711, 2.6337029933929443, 12.177719116210938, 20.183080673217773, 1.1095930337905884, 12800, 9.838803196394459e-05]
2023-02-22 04:05:51,845 32k INFO ====> Epoch: 131
2023-02-22 04:07:37,384 32k INFO ====> Epoch: 132
2023-02-22 04:08:54,557 32k INFO Train Epoch: 133 [65%]
2023-02-22 04:08:54,557 32k INFO [2.3692660331726074, 2.2918972969055176, 13.822396278381348, 20.760438919067383, 0.7944438457489014, 13000, 9.836343649326659e-05]
2023-02-22 04:08:58,801 32k INFO Saving model and optimizer state at iteration 133 to ./logs\32k\G_13000.pth
2023-02-22 04:09:17,244 32k INFO Saving model and optimizer state at iteration 133 to ./logs\32k\D_13000.pth
2023-02-22 04:09:49,268 32k INFO ====> Epoch: 133
2023-02-22 04:11:35,690 32k INFO ====> Epoch: 134
2023-02-22 04:12:56,725 32k INFO Train Epoch: 135 [69%]
2023-02-22 04:12:56,725 32k INFO [2.538072109222412, 2.147736072540283, 8.285018920898438, 15.206883430480957, 0.571492075920105, 13200, 9.833884717107196e-05]
2023-02-22 04:13:22,345 32k INFO ====> Epoch: 135
2023-02-22 04:15:08,661 32k INFO ====> Epoch: 136
2023-02-22 04:16:32,598 32k INFO Train Epoch: 137 [73%]
2023-02-22 04:16:32,599 32k INFO [2.5354621410369873, 2.3016741275787354, 11.92038345336914, 18.641353607177734, 0.539569616317749, 13400, 9.831426399582366e-05]
2023-02-22 04:16:54,692 32k INFO ====> Epoch: 137
2023-02-22 04:18:41,307 32k INFO ====> Epoch: 138
2023-02-22 04:20:09,435 32k INFO Train Epoch: 139 [78%]
2023-02-22 04:20:09,436 32k INFO [2.333071231842041, 2.484257221221924, 12.461816787719727, 19.274198532104492, 0.8236677050590515, 13600, 9.828968696598508e-05]
2023-02-22 04:20:28,071 32k INFO ====> Epoch: 139
2023-02-22 04:22:14,167 32k INFO ====> Epoch: 140
2023-02-22 04:23:45,593 32k INFO Train Epoch: 141 [82%]
2023-02-22 04:23:45,594 32k INFO [2.4791781902313232, 2.1751325130462646, 10.909151077270508, 14.634515762329102, 0.7781238555908203, 13800, 9.826511608001993e-05]
2023-02-22 04:24:00,820 32k INFO ====> Epoch: 141
2023-02-22 04:25:46,692 32k INFO ====> Epoch: 142
2023-02-22 04:27:20,993 32k INFO Train Epoch: 143 [86%]
2023-02-22 04:27:20,993 32k INFO [2.346167802810669, 2.548734188079834, 13.352293014526367, 18.923927307128906, 0.5950165390968323, 14000, 9.824055133639235e-05]
2023-02-22 04:27:25,245 32k INFO Saving model and optimizer state at iteration 143 to ./logs\32k\G_14000.pth
2023-02-22 04:27:45,494 32k INFO Saving model and optimizer state at iteration 143 to ./logs\32k\D_14000.pth
2023-02-22 04:28:00,501 32k INFO ====> Epoch: 143
2023-02-22 04:29:46,683 32k INFO ====> Epoch: 144
2023-02-22 04:31:25,056 32k INFO Train Epoch: 145 [90%]
2023-02-22 04:31:25,056 32k INFO [2.4927594661712646, 2.3070590496063232, 10.073404312133789, 16.529218673706055, 0.670913815498352, 14200, 9.821599273356685e-05]
2023-02-22 04:31:33,427 32k INFO ====> Epoch: 145
2023-02-22 04:33:19,691 32k INFO ====> Epoch: 146
2023-02-22 04:35:00,838 32k INFO Train Epoch: 147 [94%]
2023-02-22 04:35:00,839 32k INFO [2.4546353816986084, 2.278806686401367, 8.069828987121582, 13.182048797607422, 0.5054904222488403, 14400, 9.819144027000834e-05]
2023-02-22 04:35:05,693 32k INFO ====> Epoch: 147
2023-02-22 04:36:51,295 32k INFO ====> Epoch: 148
2023-02-22 04:38:35,846 32k INFO Train Epoch: 149 [98%]
2023-02-22 04:38:35,847 32k INFO [2.5683984756469727, 2.2057595252990723, 10.475971221923828, 16.20850944519043, 0.9734923839569092, 14600, 9.816689394418209e-05]
2023-02-22 04:38:37,247 32k INFO ====> Epoch: 149
2023-02-22 04:40:25,138 32k INFO ====> Epoch: 150
2023-02-22 04:42:11,207 32k INFO ====> Epoch: 151
2023-02-22 04:42:35,029 32k INFO Train Epoch: 152 [2%]
2023-02-22 04:42:35,030 32k INFO [2.4951038360595703, 2.260463237762451, 11.500944137573242, 17.902042388916016, 0.9977881908416748, 14800, 9.813008596033443e-05]
2023-02-22 04:43:57,118 32k INFO ====> Epoch: 152
2023-02-22 04:45:42,690 32k INFO ====> Epoch: 153
2023-02-22 04:46:10,019 32k INFO Train Epoch: 154 [6%]
2023-02-22 04:46:10,019 32k INFO [2.4445221424102783, 2.4485650062561035, 12.310842514038086, 20.25748634338379, 0.7647718191146851, 15000, 9.810555497212693e-05]
2023-02-22 04:46:14,290 32k INFO Saving model and optimizer state at iteration 154 to ./logs\32k\G_15000.pth
2023-02-22 04:46:31,659 32k INFO Saving model and optimizer state at iteration 154 to ./logs\32k\D_15000.pth
2023-02-22 04:47:53,918 32k INFO ====> Epoch: 154
2023-02-22 04:49:40,479 32k INFO ====> Epoch: 155
2023-02-22 04:50:11,813 32k INFO Train Epoch: 156 [10%]
2023-02-22 04:50:11,813 32k INFO [2.5242230892181396, 2.2266323566436768, 12.2184419631958, 17.383615493774414, 1.0418349504470825, 15200, 9.808103011628319e-05]
2023-02-22 04:51:27,234 32k INFO ====> Epoch: 156
2023-02-22 04:53:13,568 32k INFO ====> Epoch: 157
2023-02-22 04:53:47,515 32k INFO Train Epoch: 158 [14%]
2023-02-22 04:53:47,516 32k INFO [2.5447096824645996, 2.3157331943511963, 8.80300521850586, 14.484112739562988, 0.7142147421836853, 15400, 9.80565113912702e-05]
2023-02-22 04:54:59,247 32k INFO ====> Epoch: 158
2023-02-22 04:56:44,971 32k INFO ====> Epoch: 159
2023-02-22 04:57:22,561 32k INFO Train Epoch: 160 [18%]
2023-02-22 04:57:22,561 32k INFO [2.3957338333129883, 2.372086524963379, 8.605619430541992, 15.040207862854004, 0.9432356357574463, 15600, 9.803199879555537e-05]
2023-02-22 04:58:30,960 32k INFO ====> Epoch: 160
2023-02-22 05:00:16,754 32k INFO ====> Epoch: 161
2023-02-22 05:00:57,776 32k INFO Train Epoch: 162 [22%]
2023-02-22 05:00:57,776 32k INFO [2.501303195953369, 2.3233895301818848, 9.497570991516113, 15.891914367675781, 0.9494279026985168, 15800, 9.800749232760646e-05]
2023-02-22 05:02:02,727 32k INFO ====> Epoch: 162
2023-02-22 05:03:48,404 32k INFO ====> Epoch: 163
2023-02-22 05:04:32,896 32k INFO Train Epoch: 164 [27%]
2023-02-22 05:04:32,896 32k INFO [2.362159013748169, 2.503201484680176, 10.656201362609863, 16.783613204956055, 0.9532788395881653, 16000, 9.798299198589162e-05]
2023-02-22 05:04:37,788 32k INFO Saving model and optimizer state at iteration 164 to ./logs\32k\G_16000.pth
2023-02-22 05:04:56,829 32k INFO Saving model and optimizer state at iteration 164 to ./logs\32k\D_16000.pth
2023-02-22 05:06:01,710 32k INFO ====> Epoch: 164
2023-02-22 05:07:48,176 32k INFO ====> Epoch: 165
2023-02-22 05:08:36,653 32k INFO Train Epoch: 166 [31%]
2023-02-22 05:08:36,653 32k INFO [2.57973575592041, 1.983064889907837, 8.993478775024414, 16.000688552856445, 0.700424075126648, 16200, 9.795849776887939e-05]
2023-02-22 05:09:34,858 32k INFO ====> Epoch: 166
2023-02-22 05:11:21,054 32k INFO ====> Epoch: 167
2023-02-22 05:12:13,078 32k INFO Train Epoch: 168 [35%]
2023-02-22 05:12:13,079 32k INFO [2.464907646179199, 2.1761608123779297, 8.09338665008545, 13.37960433959961, 0.97138512134552, 16400, 9.79340096750387e-05]
2023-02-22 05:13:07,694 32k INFO ====> Epoch: 168
2023-02-22 05:14:53,372 32k INFO ====> Epoch: 169
2023-02-22 05:15:48,194 32k INFO Train Epoch: 170 [39%]
2023-02-22 05:15:48,195 32k INFO [2.4897146224975586, 2.3846330642700195, 9.467925071716309, 17.97328758239746, 0.2975856363773346, 16600, 9.790952770283884e-05]
2023-02-22 05:16:39,376 32k INFO ====> Epoch: 170
2023-02-22 05:18:24,901 32k INFO ====> Epoch: 171
2023-02-22 05:19:22,981 32k INFO Train Epoch: 172 [43%]
2023-02-22 05:19:22,981 32k INFO [2.409517288208008, 2.347320795059204, 13.86928939819336, 19.764402389526367, 0.5637670159339905, 16800, 9.78850518507495e-05]
2023-02-22 05:20:10,773 32k INFO ====> Epoch: 172
2023-02-22 05:21:56,270 32k INFO ====> Epoch: 173
2023-02-22 05:22:57,877 32k INFO Train Epoch: 174 [47%]
2023-02-22 05:22:57,877 32k INFO [2.4215376377105713, 2.1729211807250977, 11.39505386352539, 18.73835563659668, 0.5306751728057861, 17000, 9.786058211724074e-05]
2023-02-22 05:23:02,291 32k INFO Saving model and optimizer state at iteration 174 to ./logs\32k\G_17000.pth
2023-02-22 05:23:19,621 32k INFO Saving model and optimizer state at iteration 174 to ./logs\32k\D_17000.pth
2023-02-22 05:24:07,507 32k INFO ====> Epoch: 174
2023-02-22 05:25:53,899 32k INFO ====> Epoch: 175
2023-02-22 05:26:59,648 32k INFO Train Epoch: 176 [51%]
2023-02-22 05:26:59,648 32k INFO [2.41857647895813, 2.1925084590911865, 10.57753849029541, 18.43206214904785, 0.6052053570747375, 17200, 9.783611850078301e-05]
2023-02-22 05:27:40,697 32k INFO ====> Epoch: 176
2023-02-22 05:29:27,079 32k INFO ====> Epoch: 177
2023-02-22 05:30:35,514 32k INFO Train Epoch: 178 [55%]
2023-02-22 05:30:35,514 32k INFO [2.3244147300720215, 2.360485315322876, 14.300114631652832, 19.833555221557617, 0.5028417110443115, 17400, 9.781166099984716e-05]
2023-02-22 05:31:12,982 32k INFO ====> Epoch: 178
2023-02-22 05:33:00,756 32k INFO ====> Epoch: 179
2023-02-22 05:34:14,763 32k INFO Train Epoch: 180 [59%]
2023-02-22 05:34:14,763 32k INFO [2.428579807281494, 2.262065887451172, 10.987186431884766, 19.099437713623047, 0.6582584381103516, 17600, 9.778720961290439e-05]
2023-02-22 05:34:48,821 32k INFO ====> Epoch: 180
2023-02-22 05:36:34,639 32k INFO ====> Epoch: 181
2023-02-22 05:37:49,982 32k INFO Train Epoch: 182 [63%]
2023-02-22 05:37:49,983 32k INFO [2.61875581741333, 1.9758424758911133, 9.652741432189941, 14.875738143920898, 0.9741345643997192, 17800, 9.776276433842631e-05]
2023-02-22 05:38:20,692 32k INFO ====> Epoch: 182
2023-02-22 05:40:06,124 32k INFO ====> Epoch: 183
2023-02-22 05:41:24,889 32k INFO Train Epoch: 184 [67%]
2023-02-22 05:41:24,890 32k INFO [2.540966033935547, 2.3001551628112793, 9.484389305114746, 17.637712478637695, 0.7832726240158081, 18000, 9.773832517488488e-05]
2023-02-22 05:41:29,138 32k INFO Saving model and optimizer state at iteration 184 to ./logs\32k\G_18000.pth
2023-02-22 05:41:45,744 32k INFO Saving model and optimizer state at iteration 184 to ./logs\32k\D_18000.pth
2023-02-22 05:42:16,267 32k INFO ====> Epoch: 184
2023-02-22 05:44:02,354 32k INFO ====> Epoch: 185
2023-02-22 05:45:25,035 32k INFO Train Epoch: 186 [71%]
2023-02-22 05:45:25,035 32k INFO [2.5191123485565186, 2.5359508991241455, 11.501497268676758, 19.635244369506836, 0.6616455912590027, 18200, 9.771389212075249e-05]
2023-02-22 05:45:48,797 32k INFO ====> Epoch: 186
2023-02-22 05:47:34,913 32k INFO ====> Epoch: 187
2023-02-22 05:49:00,561 32k INFO Train Epoch: 188 [76%]
2023-02-22 05:49:00,562 32k INFO [2.280853748321533, 2.274311065673828, 13.668123245239258, 18.913005828857422, 1.0380589962005615, 18400, 9.768946517450186e-05]
2023-02-22 05:49:20,932 32k INFO ====> Epoch: 188
2023-02-22 05:51:06,481 32k INFO ====> Epoch: 189
2023-02-22 05:52:35,577 32k INFO Train Epoch: 190 [80%]
2023-02-22 05:52:35,577 32k INFO [2.2568392753601074, 2.6230015754699707, 11.525728225708008, 18.298168182373047, 0.6746451258659363, 18600, 9.766504433460612e-05]
2023-02-22 05:52:52,435 32k INFO ====> Epoch: 190
2023-02-22 05:54:38,241 32k INFO ====> Epoch: 191
2023-02-22 05:56:10,787 32k INFO Train Epoch: 192 [84%]
2023-02-22 05:56:10,788 32k INFO [2.5578360557556152, 2.209872007369995, 8.583542823791504, 16.00850486755371, 1.0789265632629395, 18800, 9.764062959953878e-05]
2023-02-22 05:56:24,223 32k INFO ====> Epoch: 192
2023-02-22 05:58:09,905 32k INFO ====> Epoch: 193
2023-02-22 05:59:45,738 32k INFO Train Epoch: 194 [88%]
2023-02-22 05:59:45,738 32k INFO [2.387702703475952, 2.4445672035217285, 13.355128288269043, 16.8066463470459, 1.147564172744751, 19000, 9.761622096777372e-05]
2023-02-22 05:59:50,090 32k INFO Saving model and optimizer state at iteration 194 to ./logs\32k\G_19000.pth
2023-02-22 06:00:07,465 32k INFO Saving model and optimizer state at iteration 194 to ./logs\32k\D_19000.pth
2023-02-22 06:00:21,346 32k INFO ====> Epoch: 194
2023-02-22 06:02:07,550 32k INFO ====> Epoch: 195
2023-02-22 06:03:47,560 32k INFO Train Epoch: 196 [92%]
2023-02-22 06:03:47,561 32k INFO [2.5213208198547363, 2.257534980773926, 12.258462905883789, 19.621543884277344, 0.6611289978027344, 19200, 9.759181843778522e-05]
2023-02-22 06:03:54,101 32k INFO ====> Epoch: 196
2023-02-22 06:05:40,426 32k INFO ====> Epoch: 197
2023-02-22 06:07:23,844 32k INFO Train Epoch: 198 [96%]
2023-02-22 06:07:23,845 32k INFO [2.470740795135498, 2.21482253074646, 9.043266296386719, 14.386473655700684, 0.9298809766769409, 19400, 9.756742200804793e-05]
2023-02-22 06:07:26,950 32k INFO ====> Epoch: 198
2023-02-22 06:09:12,577 32k INFO ====> Epoch: 199
2023-02-22 06:10:58,804 32k INFO ====> Epoch: 200
2023-02-22 06:11:21,552 32k INFO Train Epoch: 201 [0%]
2023-02-22 06:11:21,553 32k INFO [2.2393460273742676, 2.660759210586548, 12.342625617980957, 18.233539581298828, 0.6276586651802063, 19600, 9.753083879807726e-05]
2023-02-22 06:12:45,383 32k INFO ====> Epoch: 201
2023-02-22 06:14:31,637
32k INFO ====> Epoch: 202 2023-02-22 06:14:57,801 32k INFO Train Epoch: 203 [4%] 2023-02-22 06:14:57,801 32k INFO [2.072805166244507, 2.761627435684204, 14.632217407226562, 19.264434814453125, 0.6756956577301025, 19800, 9.750645761229709e-05] 2023-02-22 06:16:18,193 32k INFO ====> Epoch: 203 2023-02-22 06:18:03,843 32k INFO ====> Epoch: 204 2023-02-22 06:18:32,695 32k INFO Train Epoch: 205 [8%] 2023-02-22 06:18:32,695 32k INFO [2.238548994064331, 2.5923948287963867, 13.83581829071045, 18.70966148376465, 0.808910071849823, 20000, 9.748208252143241e-05] 2023-02-22 06:18:36,991 32k INFO Saving model and optimizer state at iteration 205 to ./logs\32k\G_20000.pth 2023-02-22 06:18:52,780 32k INFO Saving model and optimizer state at iteration 205 to ./logs\32k\D_20000.pth 2023-02-22 06:20:13,632 32k INFO ====> Epoch: 205 2023-02-22 06:21:59,799 32k INFO ====> Epoch: 206 2023-02-22 06:22:32,683 32k INFO Train Epoch: 207 [12%] 2023-02-22 06:22:32,684 32k INFO [2.494340181350708, 2.35404372215271, 10.402792930603027, 18.646995544433594, 1.194170594215393, 20200, 9.745771352395957e-05] 2023-02-22 06:23:46,237 32k INFO ====> Epoch: 207 2023-02-22 06:25:31,755 32k INFO ====> Epoch: 208 2023-02-22 06:26:07,498 32k INFO Train Epoch: 209 [16%] 2023-02-22 06:26:07,499 32k INFO [2.5353317260742188, 2.2308082580566406, 10.610563278198242, 16.8080997467041, 0.9148491621017456, 20400, 9.743335061835535e-05] 2023-02-22 06:27:17,681 32k INFO ====> Epoch: 209 2023-02-22 06:29:03,252 32k INFO ====> Epoch: 210 2023-02-22 06:29:42,394 32k INFO Train Epoch: 211 [20%] 2023-02-22 06:29:42,395 32k INFO [2.6447272300720215, 1.964892864227295, 8.727736473083496, 12.838239669799805, 0.6997866630554199, 20600, 9.740899380309685e-05] 2023-02-22 06:30:49,054 32k INFO ====> Epoch: 211 2023-02-22 06:32:35,297 32k INFO ====> Epoch: 212 2023-02-22 06:33:18,576 32k INFO Train Epoch: 213 [24%] 2023-02-22 06:33:18,577 32k INFO [2.2379984855651855, 2.403615713119507, 15.184333801269531, 22.139081954956055, 
0.9633049368858337, 20800, 9.73846430766616e-05] 2023-02-22 06:34:21,815 32k INFO ====> Epoch: 213 2023-02-22 06:36:08,032 32k INFO ====> Epoch: 214 2023-02-22 06:36:54,126 32k INFO Train Epoch: 215 [29%] 2023-02-22 06:36:54,127 32k INFO [2.6470189094543457, 2.246066093444824, 8.519152641296387, 15.820573806762695, 1.0386731624603271, 21000, 9.736029843752747e-05] 2023-02-22 06:36:58,320 32k INFO Saving model and optimizer state at iteration 215 to ./logs\32k\G_21000.pth 2023-02-22 06:37:15,347 32k INFO Saving model and optimizer state at iteration 215 to ./logs\32k\D_21000.pth 2023-02-22 06:38:18,371 32k INFO ====> Epoch: 215 2023-02-22 06:40:04,492 32k INFO ====> Epoch: 216 2023-02-22 06:40:54,670 32k INFO Train Epoch: 217 [33%] 2023-02-22 06:40:54,670 32k INFO [2.090405225753784, 2.5237374305725098, 15.738249778747559, 21.418203353881836, 1.2358826398849487, 21200, 9.733595988417275e-05] 2023-02-22 06:41:51,113 32k INFO ====> Epoch: 217 2023-02-22 06:43:37,251 32k INFO ====> Epoch: 218 2023-02-22 06:44:30,781 32k INFO Train Epoch: 219 [37%] 2023-02-22 06:44:30,781 32k INFO [2.5889697074890137, 2.2686333656311035, 8.668608665466309, 14.998424530029297, 1.0810593366622925, 21400, 9.731162741507607e-05] 2023-02-22 06:45:23,689 32k INFO ====> Epoch: 219 2023-02-22 06:47:09,292 32k INFO ====> Epoch: 220 2023-02-22 06:48:06,268 32k INFO Train Epoch: 221 [41%] 2023-02-22 06:48:06,269 32k INFO [2.2671685218811035, 2.67838978767395, 14.996197700500488, 21.1551570892334, 0.7494913339614868, 21600, 9.728730102871649e-05] 2023-02-22 06:48:55,957 32k INFO ====> Epoch: 221 2023-02-22 06:50:42,311 32k INFO ====> Epoch: 222 2023-02-22 06:51:42,046 32k INFO Train Epoch: 223 [45%] 2023-02-22 06:51:42,046 32k INFO [2.547105312347412, 2.134889841079712, 9.186394691467285, 14.266695022583008, 0.6058226823806763, 21800, 9.726298072357337e-05] 2023-02-22 06:52:28,072 32k INFO ====> Epoch: 223 2023-02-22 06:54:14,252 32k INFO ====> Epoch: 224 2023-02-22 06:55:18,050 32k INFO Train 
Epoch: 225 [49%] 2023-02-22 06:55:18,050 32k INFO [2.3165359497070312, 2.48864483833313, 13.522871971130371, 20.89539337158203, 0.46696582436561584, 22000, 9.723866649812655e-05] 2023-02-22 06:55:22,919 32k INFO Saving model and optimizer state at iteration 225 to ./logs\32k\G_22000.pth 2023-02-22 06:55:42,032 32k INFO Saving model and optimizer state at iteration 225 to ./logs\32k\D_22000.pth 2023-02-22 06:56:28,028 32k INFO ====> Epoch: 225 2023-02-22 06:58:14,309 32k INFO ====> Epoch: 226 2023-02-22 06:59:21,533 32k INFO Train Epoch: 227 [53%] 2023-02-22 06:59:21,534 32k INFO [2.635965347290039, 2.3602662086486816, 8.628167152404785, 12.540104866027832, 0.9171032309532166, 22200, 9.721435835085619e-05] 2023-02-22 07:00:00,662 32k INFO ====> Epoch: 227 2023-02-22 07:01:46,728 32k INFO ====> Epoch: 228 2023-02-22 07:02:57,519 32k INFO Train Epoch: 229 [57%] 2023-02-22 07:02:57,519 32k INFO [2.310483932495117, 2.573824405670166, 13.390860557556152, 19.26698112487793, 0.859022855758667, 22400, 9.719005628024282e-05] 2023-02-22 07:03:33,283 32k INFO ====> Epoch: 229 2023-02-22 07:05:18,825 32k INFO ====> Epoch: 230 2023-02-22 07:06:32,406 32k INFO Train Epoch: 231 [61%] 2023-02-22 07:06:32,406 32k INFO [2.4883522987365723, 2.472439765930176, 11.817012786865234, 18.11069107055664, 0.8754430413246155, 22600, 9.716576028476738e-05] 2023-02-22 07:07:04,836 32k INFO ====> Epoch: 231 2023-02-22 07:08:50,484 32k INFO ====> Epoch: 232 2023-02-22 07:10:07,417 32k INFO Train Epoch: 233 [65%] 2023-02-22 07:10:07,418 32k INFO [2.2918412685394287, 2.6064612865448, 12.61550235748291, 19.839508056640625, 0.7568845748901367, 22800, 9.714147036291117e-05] 2023-02-22 07:10:36,292 32k INFO ====> Epoch: 233 2023-02-22 07:12:21,934 32k INFO ====> Epoch: 234 2023-02-22 07:13:42,479 32k INFO Train Epoch: 235 [69%] 2023-02-22 07:13:42,479 32k INFO [2.367769479751587, 2.290179967880249, 12.814183235168457, 18.604658126831055, 0.7774146199226379, 23000, 9.711718651315591e-05] 2023-02-22 
07:13:46,725 32k INFO Saving model and optimizer state at iteration 235 to ./logs\32k\G_23000.pth 2023-02-22 07:14:02,024 32k INFO Saving model and optimizer state at iteration 235 to ./logs\32k\D_23000.pth 2023-02-22 07:14:31,102 32k INFO ====> Epoch: 235 2023-02-22 07:16:17,362 32k INFO ====> Epoch: 236 2023-02-22 07:17:41,855 32k INFO Train Epoch: 237 [73%] 2023-02-22 07:17:41,855 32k INFO [2.2795772552490234, 2.5844802856445312, 11.45567512512207, 18.593366622924805, 0.8924825191497803, 23200, 9.709290873398365e-05] 2023-02-22 07:18:03,931 32k INFO ====> Epoch: 237 2023-02-22 07:19:50,029 32k INFO ====> Epoch: 238 2023-02-22 07:21:17,892 32k INFO Train Epoch: 239 [78%] 2023-02-22 07:21:17,893 32k INFO [2.455188274383545, 2.681325912475586, 12.097657203674316, 20.17988395690918, 1.3684245347976685, 23400, 9.706863702387684e-05] 2023-02-22 07:21:36,551 32k INFO ====> Epoch: 239 2023-02-22 07:23:24,284 32k INFO ====> Epoch: 240 2023-02-22 07:24:57,015 32k INFO Train Epoch: 241 [82%] 2023-02-22 07:24:57,016 32k INFO [2.629354953765869, 2.118459701538086, 10.275418281555176, 14.83414363861084, 0.5261375904083252, 23600, 9.704437138131832e-05] 2023-02-22 07:25:12,296 32k INFO ====> Epoch: 241 2023-02-22 07:26:57,752 32k INFO ====> Epoch: 242 2023-02-22 07:28:32,000 32k INFO Train Epoch: 243 [86%] 2023-02-22 07:28:32,001 32k INFO [2.329456329345703, 2.3450889587402344, 12.364802360534668, 16.953144073486328, 0.7070972919464111, 23800, 9.702011180479129e-05] 2023-02-22 07:28:43,709 32k INFO ====> Epoch: 243 2023-02-22 07:30:29,209 32k INFO ====> Epoch: 244 2023-02-22 07:32:06,746 32k INFO Train Epoch: 245 [90%] 2023-02-22 07:32:06,747 32k INFO [2.4288482666015625, 2.3079452514648438, 14.041245460510254, 19.745018005371094, 0.862689733505249, 24000, 9.699585829277933e-05] 2023-02-22 07:32:10,956 32k INFO Saving model and optimizer state at iteration 245 to ./logs\32k\G_24000.pth 2023-02-22 07:32:29,071 32k INFO Saving model and optimizer state at iteration 245 to 
./logs\32k\D_24000.pth 2023-02-22 07:32:40,767 32k INFO ====> Epoch: 245 2023-02-22 07:34:26,808 32k INFO ====> Epoch: 246 2023-02-22 07:36:08,402 32k INFO Train Epoch: 247 [94%] 2023-02-22 07:36:08,403 32k INFO [2.373241901397705, 2.098410129547119, 10.478768348693848, 16.175146102905273, 0.6502178311347961, 24200, 9.69716108437664e-05] 2023-02-22 07:36:13,231 32k INFO ====> Epoch: 247 2023-02-22 07:37:59,442 32k INFO ====> Epoch: 248 2023-02-22 07:39:43,876 32k INFO Train Epoch: 249 [98%] 2023-02-22 07:39:43,877 32k INFO [2.40415096282959, 2.40627121925354, 10.043416976928711, 15.431482315063477, 0.813330888748169, 24400, 9.694736945623688e-05] 2023-02-22 07:39:45,264 32k INFO ====> Epoch: 249 2023-02-22 07:41:31,487 32k INFO ====> Epoch: 250 2023-02-22 07:43:17,279 32k INFO ====> Epoch: 251 2023-02-22 07:43:40,973 32k INFO Train Epoch: 252 [2%] 2023-02-22 07:43:40,973 32k INFO [2.2679550647735596, 2.6712074279785156, 11.691126823425293, 17.59365463256836, 0.4011988639831543, 24600, 9.691101873690936e-05] 2023-02-22 07:45:03,101 32k INFO ====> Epoch: 252 2023-02-22 07:46:48,461 32k INFO ====> Epoch: 253 2023-02-22 07:47:15,562 32k INFO Train Epoch: 254 [6%] 2023-02-22 07:47:15,562 32k INFO [2.176159381866455, 2.6822993755340576, 14.428181648254395, 21.809728622436523, 0.8452849984169006, 24800, 9.68867924964598e-05] 2023-02-22 07:48:34,159 32k INFO ====> Epoch: 254 2023-02-22 07:50:19,905 32k INFO ====> Epoch: 255 2023-02-22 07:50:51,111 32k INFO Train Epoch: 256 [10%] 2023-02-22 07:50:51,111 32k INFO [2.5852203369140625, 2.461092710494995, 10.04835319519043, 14.898118019104004, 0.956852376461029, 25000, 9.68625723121918e-05] 2023-02-22 07:50:55,927 32k INFO Saving model and optimizer state at iteration 256 to ./logs\32k\G_25000.pth 2023-02-22 07:51:13,413 32k INFO Saving model and optimizer state at iteration 256 to ./logs\32k\D_25000.pth 2023-02-22 07:52:32,056 32k INFO ====> Epoch: 256 2023-02-22 07:54:18,097 32k INFO ====> Epoch: 257 2023-02-22 07:54:52,734 
32k INFO Train Epoch: 258 [14%] 2023-02-22 07:54:52,734 32k INFO [2.432253837585449, 2.456174850463867, 9.509150505065918, 16.501110076904297, 0.4222906529903412, 25200, 9.683835818259144e-05] 2023-02-22 07:56:04,434 32k INFO ====> Epoch: 258 2023-02-22 07:57:50,485 32k INFO ====> Epoch: 259 2023-02-22 07:58:27,976 32k INFO Train Epoch: 260 [18%] 2023-02-22 07:58:27,977 32k INFO [2.4247806072235107, 2.345371723175049, 8.857375144958496, 15.63666820526123, 0.7407233119010925, 25400, 9.681415010614512e-05] 2023-02-22 07:59:36,335 32k INFO ====> Epoch: 260 2023-02-22 08:01:21,939 32k INFO ====> Epoch: 261 2023-02-22 08:02:02,960 32k INFO Train Epoch: 262 [22%] 2023-02-22 08:02:02,960 32k INFO [2.6320114135742188, 2.4106760025024414, 8.685118675231934, 12.927850723266602, 0.9680966138839722, 25600, 9.678994808133967e-05] 2023-02-22 08:03:07,846 32k INFO ====> Epoch: 262 2023-02-22 08:04:53,437 32k INFO ====> Epoch: 263 2023-02-22 08:05:37,784 32k INFO Train Epoch: 264 [27%] 2023-02-22 08:05:37,784 32k INFO [2.500497341156006, 2.2675929069519043, 8.80232048034668, 15.311875343322754, 0.8597989678382874, 25800, 9.676575210666227e-05] 2023-02-22 08:06:39,268 32k INFO ====> Epoch: 264 2023-02-22 08:08:24,850 32k INFO ====> Epoch: 265 2023-02-22 08:09:12,688 32k INFO Train Epoch: 266 [31%] 2023-02-22 08:09:12,688 32k INFO [2.3594794273376465, 2.2052268981933594, 10.455811500549316, 16.885915756225586, 0.6565972566604614, 26000, 9.674156218060047e-05] 2023-02-22 08:09:17,007 32k INFO Saving model and optimizer state at iteration 266 to ./logs\32k\G_26000.pth 2023-02-22 08:09:29,919 32k INFO Saving model and optimizer state at iteration 266 to ./logs\32k\D_26000.pth 2023-02-22 08:10:31,024 32k INFO ====> Epoch: 266 2023-02-22 08:12:17,253 32k INFO ====> Epoch: 267 2023-02-22 08:13:09,136 32k INFO Train Epoch: 268 [35%] 2023-02-22 08:13:09,137 32k INFO [2.6458592414855957, 2.1084747314453125, 10.409239768981934, 15.455072402954102, 0.5523946285247803, 26200, 
9.671737830164223e-05] 2023-02-22 08:14:03,759 32k INFO ====> Epoch: 268 2023-02-22 08:15:49,415 32k INFO ====> Epoch: 269 2023-02-22 08:16:44,117 32k INFO Train Epoch: 270 [39%] 2023-02-22 08:16:44,118 32k INFO [2.1286776065826416, 2.643979549407959, 11.154251098632812, 16.767776489257812, 0.3980034291744232, 26400, 9.669320046827584e-05] 2023-02-22 08:17:35,389 32k INFO ====> Epoch: 270 2023-02-22 08:19:21,150 32k INFO ====> Epoch: 271 2023-02-22 08:20:19,919 32k INFO Train Epoch: 272 [43%] 2023-02-22 08:20:19,919 32k INFO [2.36246919631958, 2.305928945541382, 13.173051834106445, 20.20313262939453, 0.4515596628189087, 26600, 9.666902867899003e-05] 2023-02-22 08:21:07,775 32k INFO ====> Epoch: 272 2023-02-22 08:22:54,059 32k INFO ====> Epoch: 273 2023-02-22 08:23:55,545 32k INFO Train Epoch: 274 [47%] 2023-02-22 08:23:55,545 32k INFO [2.1859402656555176, 2.502265214920044, 10.459367752075195, 15.742925643920898, 0.7773081660270691, 26800, 9.664486293227385e-05] 2023-02-22 08:24:39,934 32k INFO ====> Epoch: 274 2023-02-22 08:26:25,586 32k INFO ====> Epoch: 275 2023-02-22 08:27:30,547 32k INFO Train Epoch: 276 [51%] 2023-02-22 08:27:30,548 32k INFO [2.3840584754943848, 2.286221742630005, 10.370285034179688, 17.517276763916016, 0.36077797412872314, 27000, 9.662070322661676e-05] 2023-02-22 08:27:34,765 32k INFO Saving model and optimizer state at iteration 276 to ./logs\32k\G_27000.pth 2023-02-22 08:27:49,261 32k INFO Saving model and optimizer state at iteration 276 to ./logs\32k\D_27000.pth 2023-02-22 08:28:33,782 32k INFO ====> Epoch: 276 2023-02-22 08:30:20,077 32k INFO ====> Epoch: 277 2023-02-22 08:31:29,161 32k INFO Train Epoch: 278 [55%] 2023-02-22 08:31:29,161 32k INFO [2.207501173019409, 2.513786792755127, 13.21751880645752, 19.042682647705078, 0.7013548612594604, 27200, 9.659654956050859e-05] 2023-02-22 08:32:06,611 32k INFO ====> Epoch: 278 2023-02-22 08:33:52,136 32k INFO ====> Epoch: 279 2023-02-22 08:35:04,124 32k INFO Train Epoch: 280 [59%] 2023-02-22 
08:35:04,124 32k INFO [2.2693753242492676, 2.6720073223114014, 12.635223388671875, 20.047088623046875, 1.1133509874343872, 27400, 9.657240193243954e-05] 2023-02-22 08:35:38,250 32k INFO ====> Epoch: 280 2023-02-22 08:37:23,913 32k INFO ====> Epoch: 281 2023-02-22 08:38:39,215 32k INFO Train Epoch: 282 [63%] 2023-02-22 08:38:39,216 32k INFO [2.389529228210449, 2.2568767070770264, 10.048675537109375, 14.323448181152344, 0.7161829471588135, 27600, 9.65482603409002e-05] 2023-02-22 08:39:09,969 32k INFO ====> Epoch: 282 2023-02-22 08:40:56,377 32k INFO ====> Epoch: 283 2023-02-22 08:42:15,609 32k INFO Train Epoch: 284 [67%] 2023-02-22 08:42:15,609 32k INFO [2.511157751083374, 2.453148603439331, 11.416619300842285, 17.989450454711914, 0.6057873964309692, 27800, 9.652412478438153e-05] 2023-02-22 08:42:42,783 32k INFO ====> Epoch: 284 2023-02-22 08:44:28,460 32k INFO ====> Epoch: 285 2023-02-22 08:45:50,789 32k INFO Train Epoch: 286 [71%] 2023-02-22 08:45:50,789 32k INFO [2.419379472732544, 2.3715896606445312, 8.623769760131836, 15.313409805297852, 0.5617188811302185, 28000, 9.649999526137489e-05] 2023-02-22 08:45:55,739 32k INFO Saving model and optimizer state at iteration 286 to ./logs\32k\G_28000.pth 2023-02-22 08:46:09,553 32k INFO Saving model and optimizer state at iteration 286 to ./logs\32k\D_28000.pth 2023-02-22 08:46:37,027 32k INFO ====> Epoch: 286 2023-02-22 08:48:23,304 32k INFO ====> Epoch: 287 2023-02-22 08:49:49,517 32k INFO Train Epoch: 288 [76%] 2023-02-22 08:49:49,518 32k INFO [2.231926202774048, 2.57316255569458, 16.306894302368164, 19.932767868041992, 0.9207838773727417, 28200, 9.647587177037196e-05] 2023-02-22 08:50:09,780 32k INFO ====> Epoch: 288 2023-02-22 08:51:56,021 32k INFO ====> Epoch: 289 2023-02-22 08:53:25,110 32k INFO Train Epoch: 290 [80%] 2023-02-22 08:53:25,110 32k INFO [2.5337061882019043, 2.4603958129882812, 10.723613739013672, 16.972007751464844, 0.9773488640785217, 28400, 9.645175430986486e-05] 2023-02-22 08:53:41,993 32k INFO 
====> Epoch: 290 2023-02-22 08:55:27,579 32k INFO ====> Epoch: 291 2023-02-22 08:57:00,042 32k INFO Train Epoch: 292 [84%] 2023-02-22 08:57:00,043 32k INFO [2.457164764404297, 2.5140910148620605, 12.109970092773438, 19.576763153076172, 0.6455909609794617, 28600, 9.642764287834605e-05] 2023-02-22 08:57:13,617 32k INFO ====> Epoch: 292 2023-02-22 08:58:59,241 32k INFO ====> Epoch: 293 2023-02-22 09:00:35,167 32k INFO Train Epoch: 294 [88%] 2023-02-22 09:00:35,167 32k INFO [3.1267271041870117, 1.7442620992660522, 5.101763725280762, 9.992081642150879, 0.8443043231964111, 28800, 9.640353747430838e-05] 2023-02-22 09:00:45,190 32k INFO ====> Epoch: 294 2023-02-22 09:02:30,950 32k INFO ====> Epoch: 295 2023-02-22 09:04:10,463 32k INFO Train Epoch: 296 [92%] 2023-02-22 09:04:10,463 32k INFO [2.4258294105529785, 2.153702735900879, 11.16634464263916, 18.30190658569336, 0.6709051728248596, 29000, 9.637943809624507e-05] 2023-02-22 09:04:14,772 32k INFO Saving model and optimizer state at iteration 296 to ./logs\32k\G_29000.pth 2023-02-22 09:04:30,920 32k INFO Saving model and optimizer state at iteration 296 to ./logs\32k\D_29000.pth 2023-02-22 09:04:41,056 32k INFO ====> Epoch: 296 2023-02-22 09:06:27,402 32k INFO ====> Epoch: 297 2023-02-22 09:08:10,882 32k INFO Train Epoch: 298 [96%] 2023-02-22 09:08:10,882 32k INFO [2.3428242206573486, 2.506030559539795, 9.039055824279785, 12.474124908447266, 0.5565758347511292, 29200, 9.635534474264972e-05] 2023-02-22 09:08:13,993 32k INFO ====> Epoch: 298 2023-02-22 09:10:00,361 32k INFO ====> Epoch: 299 2023-02-22 09:11:46,836 32k INFO ====> Epoch: 300 2023-02-22 09:12:10,849 32k INFO Train Epoch: 301 [0%] 2023-02-22 09:12:10,849 32k INFO [2.0102734565734863, 3.098827838897705, 12.834500312805176, 17.658754348754883, 0.8110590577125549, 29400, 9.631921600483981e-05] 2023-02-22 09:13:34,929 32k INFO ====> Epoch: 301 2023-02-22 09:15:22,579 32k INFO ====> Epoch: 302 2023-02-22 09:15:48,125 32k INFO Train Epoch: 303 [4%] 2023-02-22 
09:15:48,125 32k INFO [2.292261838912964, 2.6530299186706543, 15.810569763183594, 19.79923439025879, 0.8924477100372314, 29600, 9.629513770582634e-05] 2023-02-22 09:17:08,489 32k INFO ====> Epoch: 303 2023-02-22 09:18:54,037 32k INFO ====> Epoch: 304 2023-02-22 09:19:22,932 32k INFO Train Epoch: 305 [8%] 2023-02-22 09:19:22,933 32k INFO [2.2535953521728516, 2.7421841621398926, 14.76373291015625, 17.443092346191406, 0.9546372294425964, 29800, 9.627106542601141e-05] 2023-02-22 09:20:39,815 32k INFO ====> Epoch: 305 2023-02-22 09:22:25,311 32k INFO ====> Epoch: 306 2023-02-22 09:22:57,677 32k INFO Train Epoch: 307 [12%] 2023-02-22 09:22:57,677 32k INFO [2.326064109802246, 2.3965041637420654, 7.343259811401367, 12.741463661193848, 0.6292210817337036, 30000, 9.62469991638903e-05] 2023-02-22 09:23:01,921 32k INFO Saving model and optimizer state at iteration 307 to ./logs\32k\G_30000.pth 2023-02-22 09:23:20,758 32k INFO Saving model and optimizer state at iteration 307 to ./logs\32k\D_30000.pth 2023-02-22 09:24:37,562 32k INFO ====> Epoch: 307 2023-02-22 09:26:23,740 32k INFO ====> Epoch: 308 2023-02-22 09:26:59,601 32k INFO Train Epoch: 309 [16%] 2023-02-22 09:26:59,601 32k INFO [2.4903805255889893, 2.4432358741760254, 9.264269828796387, 16.382070541381836, 0.8371724486351013, 30200, 9.622293891795867e-05] 2023-02-22 09:28:09,664 32k INFO ====> Epoch: 309 2023-02-22 09:29:55,317 32k INFO ====> Epoch: 310 2023-02-22 09:30:34,487 32k INFO Train Epoch: 311 [20%] 2023-02-22 09:30:34,487 32k INFO [2.6089067459106445, 2.180260181427002, 6.335816860198975, 10.976472854614258, 0.98526930809021, 30400, 9.619888468671259e-05] 2023-02-22 09:31:41,120 32k INFO ====> Epoch: 311 2023-02-22 09:33:27,469 32k INFO ====> Epoch: 312 2023-02-22 09:34:10,970 32k INFO Train Epoch: 313 [24%] 2023-02-22 09:34:10,970 32k INFO [2.2308130264282227, 2.525303363800049, 13.083917617797852, 19.732345581054688, 0.6941409707069397, 30600, 9.617483646864849e-05] 2023-02-22 09:35:14,173 32k INFO ====> 
Epoch: 313 2023-02-22 09:36:59,697 32k INFO ====> Epoch: 314 2023-02-22 09:37:45,817 32k INFO Train Epoch: 315 [29%] 2023-02-22 09:37:45,818 32k INFO [2.5213077068328857, 2.46886944770813, 10.760324478149414, 19.334060668945312, 0.5704622268676758, 30800, 9.615079426226314e-05] 2023-02-22 09:38:45,598 32k INFO ====> Epoch: 315 2023-02-22 09:40:31,200 32k INFO ====> Epoch: 316 2023-02-22 09:41:21,493 32k INFO Train Epoch: 317 [33%] 2023-02-22 09:41:21,493 32k INFO [2.1243271827697754, 2.5788512229919434, 12.918538093566895, 20.69605827331543, 0.9645518064498901, 31000, 9.612675806605373e-05] 2023-02-22 09:41:26,457 32k INFO Saving model and optimizer state at iteration 317 to ./logs\32k\G_31000.pth 2023-02-22 09:41:44,385 32k INFO Saving model and optimizer state at iteration 317 to ./logs\32k\D_31000.pth 2023-02-22 09:42:44,191 32k INFO ====> Epoch: 317 2023-02-22 09:44:30,505 32k INFO ====> Epoch: 318 2023-02-22 09:45:24,342 32k INFO Train Epoch: 319 [37%] 2023-02-22 09:45:24,343 32k INFO [2.4621365070343018, 2.3512632846832275, 11.532454490661621, 16.673147201538086, 1.0516586303710938, 31200, 9.61027278785178e-05] 2023-02-22 09:46:17,300 32k INFO ====> Epoch: 319 2023-02-22 09:48:02,873 32k INFO ====> Epoch: 320 2023-02-22 09:48:59,267 32k INFO Train Epoch: 321 [41%] 2023-02-22 09:48:59,267 32k INFO [2.293076992034912, 2.460491180419922, 15.437559127807617, 19.99898910522461, 0.8674603700637817, 31400, 9.60787036981533e-05] 2023-02-22 09:49:48,764 32k INFO ====> Epoch: 321 2023-02-22 09:51:34,708 32k INFO ====> Epoch: 322 2023-02-22 09:52:34,630 32k INFO Train Epoch: 323 [45%] 2023-02-22 09:52:34,630 32k INFO [2.3757669925689697, 2.392949104309082, 14.556235313415527, 18.955732345581055, 0.8770878911018372, 31600, 9.60546855234585e-05] 2023-02-22 09:53:20,914 32k INFO ====> Epoch: 323 2023-02-22 09:55:07,269 32k INFO ====> Epoch: 324 2023-02-22 09:56:10,506 32k INFO Train Epoch: 325 [49%] 2023-02-22 09:56:10,507 32k INFO [2.3360633850097656, 2.545222520828247, 
11.59161376953125, 17.103670120239258, 0.7123858332633972, 31800, 9.603067335293209e-05] 2023-02-22 09:56:53,220 32k INFO ====> Epoch: 325 2023-02-22 09:58:38,902 32k INFO ====> Epoch: 326 2023-02-22 09:59:45,549 32k INFO Train Epoch: 327 [53%] 2023-02-22 09:59:45,549 32k INFO [2.3887226581573486, 2.382225275039673, 9.169524192810059, 15.562906265258789, 0.9378204345703125, 32000, 9.600666718507311e-05] 2023-02-22 09:59:50,519 32k INFO Saving model and optimizer state at iteration 327 to ./logs\32k\G_32000.pth 2023-02-22 10:00:08,092 32k INFO Saving model and optimizer state at iteration 327 to ./logs\32k\D_32000.pth 2023-02-22 10:00:50,689 32k INFO ====> Epoch: 327 2023-02-22 10:02:36,996 32k INFO ====> Epoch: 328 2023-02-22 10:03:47,898 32k INFO Train Epoch: 329 [57%] 2023-02-22 10:03:47,899 32k INFO [2.1435623168945312, 2.491894483566284, 13.545025825500488, 17.95233726501465, 0.8276446461677551, 32200, 9.5982667018381e-05] 2023-02-22 10:04:23,585 32k INFO ====> Epoch: 329 2023-02-22 10:06:09,160 32k INFO ====> Epoch: 330 2023-02-22 10:07:22,888 32k INFO Train Epoch: 331 [61%] 2023-02-22 10:07:22,889 32k INFO [2.53352952003479, 2.79293155670166, 10.536802291870117, 17.276477813720703, 0.7348306179046631, 32400, 9.595867285135558e-05] 2023-02-22 10:07:55,166 32k INFO ====> Epoch: 331 2023-02-22 10:09:54,170 32k INFO ====> Epoch: 332 2023-02-22 10:11:11,362 32k INFO Train Epoch: 333 [65%] 2023-02-22 10:11:11,362 32k INFO [2.3337302207946777, 2.2436773777008057, 11.43968391418457, 16.059717178344727, 0.6464142203330994, 32600, 9.5934684682497e-05] 2023-02-22 10:11:40,327 32k INFO ====> Epoch: 333 2023-02-22 10:13:26,239 32k INFO ====> Epoch: 334 2023-02-22 10:14:46,735 32k INFO Train Epoch: 335 [69%] 2023-02-22 10:14:46,736 32k INFO [2.341135025024414, 2.178755044937134, 12.009360313415527, 17.662372589111328, 0.6575217247009277, 32800, 9.591070251030582e-05] 2023-02-22 10:15:12,226 32k INFO ====> Epoch: 335 2023-02-22 10:16:58,071 32k INFO ====> Epoch: 336 
2023-02-22 10:18:22,210 32k INFO Train Epoch: 337 [73%]
2023-02-22 10:18:22,210 32k INFO [2.5522103309631348, 2.188314914703369, 10.32480525970459, 15.917875289916992, 0.5481040477752686, 33000, 9.588672633328296e-05]
2023-02-22 10:18:26,497 32k INFO Saving model and optimizer state at iteration 337 to ./logs\32k\G_33000.pth
2023-02-22 10:18:44,754 32k INFO Saving model and optimizer state at iteration 337 to ./logs\32k\D_33000.pth
2023-02-22 10:19:10,032 32k INFO ====> Epoch: 337
2023-02-22 10:20:56,519 32k INFO ====> Epoch: 338
2023-02-22 10:22:23,859 32k INFO Train Epoch: 339 [78%]
2023-02-22 10:22:23,859 32k INFO [2.4190073013305664, 2.2712206840515137, 11.071868896484375, 18.608346939086914, 1.2736413478851318, 33200, 9.586275614992974e-05]
2023-02-22 10:22:42,496 32k INFO ====> Epoch: 339
2023-02-22 10:24:28,908 32k INFO ====> Epoch: 340
2023-02-22 10:26:00,569 32k INFO Train Epoch: 341 [82%]
2023-02-22 10:26:00,569 32k INFO [2.5759196281433105, 2.485172748565674, 11.334334373474121, 16.11148452758789, 0.9113945364952087, 33400, 9.583879195874782e-05]
2023-02-22 10:26:15,792 32k INFO ====> Epoch: 341
2023-02-22 10:28:02,420 32k INFO ====> Epoch: 342
2023-02-22 10:29:37,733 32k INFO Train Epoch: 343 [86%]
2023-02-22 10:29:37,733 32k INFO [2.478440046310425, 2.6277501583099365, 11.949339866638184, 17.164081573486328, 0.7420394420623779, 33600, 9.581483375823925e-05]
2023-02-22 10:29:49,553 32k INFO ====> Epoch: 343
2023-02-22 10:31:35,370 32k INFO ====> Epoch: 344
2023-02-22 10:33:13,278 32k INFO Train Epoch: 345 [90%]
2023-02-22 10:33:13,278 32k INFO [2.514538288116455, 2.42301344871521, 11.596924781799316, 17.29451560974121, 0.6897502541542053, 33800, 9.579088154690645e-05]
2023-02-22 10:33:21,550 32k INFO ====> Epoch: 345
2023-02-22 10:35:07,455 32k INFO ====> Epoch: 346
2023-02-22 10:36:48,752 32k INFO Train Epoch: 347 [94%]
2023-02-22 10:36:48,752 32k INFO [2.37970232963562, 2.336655378341675, 12.431614875793457, 18.605751037597656, 0.8811375498771667, 34000, 9.576693532325224e-05]
2023-02-22 10:36:53,130 32k INFO Saving model and optimizer state at iteration 347 to ./logs\32k\G_34000.pth
2023-02-22 10:37:10,041 32k INFO Saving model and optimizer state at iteration 347 to ./logs\32k\D_34000.pth
2023-02-22 10:37:18,304 32k INFO ====> Epoch: 347
2023-02-22 10:39:05,169 32k INFO ====> Epoch: 348
2023-02-22 10:40:49,972 32k INFO Train Epoch: 349 [98%]
2023-02-22 10:40:49,972 32k INFO [2.4378786087036133, 2.316551446914673, 10.715349197387695, 14.96591854095459, 0.5559961199760437, 34200, 9.574299508577979e-05]
2023-02-22 10:40:51,354 32k INFO ====> Epoch: 349
2023-02-22 10:42:37,890 32k INFO ====> Epoch: 350
2023-02-22 10:44:24,654 32k INFO ====> Epoch: 351
2023-02-22 10:44:49,299 32k INFO Train Epoch: 352 [2%]
2023-02-22 10:44:49,300 32k INFO [2.4335520267486572, 2.271144151687622, 9.197824478149414, 14.361316680908203, 0.9935851693153381, 34400, 9.570709595038851e-05]
2023-02-22 10:46:11,605 32k INFO ====> Epoch: 352
2023-02-22 10:47:58,244 32k INFO ====> Epoch: 353
2023-02-22 10:48:25,603 32k INFO Train Epoch: 354 [6%]
2023-02-22 10:48:25,604 32k INFO [2.503721237182617, 2.299598217010498, 11.784101486206055, 17.45063018798828, 0.7507224678993225, 34600, 9.568317067182427e-05]
2023-02-22 10:49:44,510 32k INFO ====> Epoch: 354
2023-02-22 10:51:30,297 32k INFO ====> Epoch: 355
2023-02-22 10:52:01,064 32k INFO Train Epoch: 356 [10%]
2023-02-22 10:52:01,064 32k INFO [2.358137369155884, 2.391530990600586, 12.17978572845459, 17.948204040527344, 0.6102503538131714, 34800, 9.565925137420586e-05]
2023-02-22 10:53:16,260 32k INFO ====> Epoch: 356
2023-02-22 10:55:05,634 32k INFO ====> Epoch: 357
2023-02-22 10:55:40,768 32k INFO Train Epoch: 358 [14%]
2023-02-22 10:55:40,768 32k INFO [2.742542028427124, 2.216801166534424, 8.76644515991211, 14.839630126953125, 0.49059024453163147, 35000, 9.56353380560381e-05]
2023-02-22 10:55:45,046 32k INFO Saving model and optimizer state at iteration 358 to ./logs\32k\G_35000.pth
2023-02-22 10:56:01,246 32k INFO Saving model and optimizer state at iteration 358 to ./logs\32k\D_35000.pth
2023-02-22 10:57:17,536 32k INFO ====> Epoch: 358
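The bracketed `INFO [...]` lines each carry several loss scalars followed by the global step (increasing in steps of the configured `log_interval` of 200) and the current learning rate. A minimal sketch for extracting those values from a log like this one is shown below; the function name `parse_losses` is my own, and the exact meaning of each leading scalar is an assumption based on the training script's logging order, not stated in the log itself.

```python
import re

# Matches the per-step loss lines, e.g.
# "2023-02-22 04:53:47,516 32k INFO [2.54..., ..., 15400, 9.8e-05]"
LOSS_LINE = re.compile(r"INFO \[([^\]]+)\]")

def parse_losses(text):
    """Return (step, lr, losses) tuples from raw log text.

    The last two values of each bracketed list are treated as the
    global step and learning rate; the rest are kept as-is, since
    their individual meaning (assumed: discriminator/generator/
    feature-matching/mel/KL terms) is not recorded in the log.
    """
    rows = []
    for m in LOSS_LINE.finditer(text):
        values = [float(v) for v in m.group(1).split(",")]
        *losses, step, lr = values
        rows.append((int(step), lr, losses))
    return rows

sample = ("2023-02-22 04:53:47,516 32k INFO [2.5447, 2.3157, 8.8030, "
          "14.4841, 0.7142, 15400, 9.80565113912702e-05]")
print(parse_losses(sample))
```

Feeding the whole log file through this gives a step-indexed series suitable for plotting the loss and learning-rate curves alongside the TensorBoard scalars the trainer already writes.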