nohup: ignoring input
Setting up data...
Starting training...
----------
Epoch: 1/300
nohup: ignoring input
Setting up data...
Starting training...
----------
Epoch: 1/300
nohup: ignoring input
Traceback (most recent call last):
  File "main.py", line 73, in <module>
    ctrbox_obj.train_network(args)
  File "/home/thsw/WJ/nyh/CODE/bba_vector/BBAVectors-Oriented-Object-Detection/train.py", line 101, in train_network
    self.model.to(self.device)
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 852, in to
    return self._apply(convert)
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 530, in _apply
    module._apply(fn)
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 530, in _apply
    module._apply(fn)
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 552, in _apply
    param_applied = fn(param)
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 850, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
RuntimeError: CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.
Traceback (most recent call last):
  File "main.py", line 73, in <module>
    ctrbox_obj.train_network(args)
  File "/home/thsw/WJ/nyh/CODE/bba_vector/BBAVectors-Oriented-Object-Detection/train.py", line 130, in train_network
    epoch_loss = self.run_epoch(phase='train',
  File "/home/thsw/WJ/nyh/CODE/bba_vector/BBAVectors-Oriented-Object-Detection/train.py", line 166, in run_epoch
    pr_decs = self.model(data_dict['input'])
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/thsw/WJ/nyh/CODE/bba_vector/BBAVectors-Oriented-Object-Detection/models/ctrbox_net.py", line 81, in forward
    c3_combine = self.dec_c3(c4_combine, x[-3])
  File "/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/thsw/WJ/nyh/CODE/bba_vector/BBAVectors-Oriented-Object-Detection/models/model_parts.py", line 37, in forward
    return self.cat_conv(torch.cat((x_up, x_low), 1))
RuntimeError: CUDA out of memory. Tried to allocate 182.00 MiB (GPU 0; 23.69 GiB total capacity; 5.63 GiB already allocated; 30.12 MiB free; 5.85 GiB reserved in total by PyTorch)
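Both crashed runs above failed with CUDA out of memory: the first while moving the model to the GPU (the card was already occupied by other processes), the second mid-forward with only 30.12 MiB free. A common mitigation is to halve the batch size and retry when an OOM-style error surfaces. The sketch below is framework-agnostic and purely illustrative: `run_with_batch_fallback` and `fake_step` are hypothetical names, not part of this repository, and `fake_step` merely simulates a GPU that only fits batches of 8 or fewer.

```python
def run_with_batch_fallback(step, batch_size, min_batch=1):
    """Call step(batch_size), halving the batch on OOM-style failures.

    `step` stands in for a per-epoch training callable; any RuntimeError
    whose message mentions 'out of memory' triggers a retry at half the
    batch size (in a real framework you would also free cached memory here).
    """
    while batch_size >= min_batch:
        try:
            return step(batch_size)
        except RuntimeError as exc:
            if "out of memory" not in str(exc):
                raise  # unrelated error: re-raise unchanged
            batch_size //= 2
    raise RuntimeError("out of memory even at the minimum batch size")


def fake_step(batch_size):
    # Stand-in for a real forward/backward pass: pretends the GPU
    # only has room for batches of size 8 or smaller.
    if batch_size > 8:
        raise RuntimeError("CUDA out of memory")
    return batch_size


print(run_with_batch_fallback(fake_step, 32))  # falls back 32 -> 16 -> 8, prints 8
```

In the actual failures above, the simpler first checks would be freeing the GPU (the first traceback shows 5.63 GiB already allocated before this job ran) or pointing the job at a less-loaded device.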
train loss: 3.3678845995809974
----------
Epoch: 2/300
/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:154: UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
  warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
train loss: 1.8854694141120445
----------
Epoch: 3/300
train loss: 1.5767962481917404
----------
Epoch: 4/300
train loss: 1.4255406602126797
----------
Epoch: 5/300
train loss: 1.3310559519180438
----------
Epoch: 6/300
train loss: 1.2327631359420173
----------
Epoch: 7/300
train loss: 1.179566274510651
----------
Epoch: 8/300
train loss: 1.1194112766079787
----------
Epoch: 9/300
train loss: 1.0934214072256554
----------
Epoch: 10/300
train loss: 1.0598071190278704
----------
Epoch: 11/300
train loss: 1.0322292461627867
----------
Epoch: 12/300
train loss: 1.0056840393964837
----------
Epoch: 13/300
train loss: 0.978235443009109
----------
Epoch: 14/300
train loss: 0.9646336492605325
----------
Epoch: 15/300
train loss: 0.9264216072312216
----------
Epoch: 16/300
train loss: 0.9201980442172144
----------
Epoch: 17/300
train loss: 0.9013974646606097
----------
Epoch: 18/300
train loss: 0.8931510070475136
----------
Epoch: 19/300
train loss: 0.872543891755546
----------
Epoch: 20/300
train loss: 0.8504351748198997
----------
Epoch: 21/300
train loss: 0.8342699241347429
----------
Epoch: 22/300
train loss: 0.8266373327592524
----------
Epoch: 23/300
train loss: 0.8242110590018877
----------
Epoch: 24/300
train loss: 0.7909619433850776
----------
Epoch: 25/300
train loss: 0.7895614755589787
----------
Epoch: 26/300
train loss: 0.7839372414277821
----------
Epoch: 27/300
train loss: 0.7723920864121216
----------
Epoch: 28/300
train loss: 0.7776255176925078
----------
Epoch: 29/300
train loss: 0.7548159630742015
----------
Epoch: 30/300
train loss: 0.7567941827199808
----------
Epoch: 31/300
train loss: 0.7433899990850832
----------
Epoch: 32/300
train loss: 0.7302218635634679
----------
Epoch: 33/300
train loss: 0.7302329011443185
----------
Epoch: 34/300
train loss: 0.7236013530594546
----------
Epoch: 35/300
train loss: 0.7173745734844266
----------
Epoch: 36/300
train loss: 0.6950798584375439
----------
Epoch: 37/300
train loss: 0.7018409407720333
----------
Epoch: 38/300
train loss: 0.6869057262452637
----------
Epoch: 39/300
train loss: 0.6841795565333308
----------
Epoch: 40/300
train loss: 0.6794473394388105
----------
Epoch: 41/300
train loss: 0.6790968551323181
----------
Epoch: 42/300
train loss: 0.6754620320549826
----------
Epoch: 43/300
train loss: 0.665113363687585
----------
Epoch: 44/300
train loss: 0.667056881618209
----------
Epoch: 45/300
train loss: 0.6623089440712114
----------
Epoch: 46/300
train loss: 0.6524794242549233
----------
Epoch: 47/300
train loss: 0.6567894975404914
----------
Epoch: 48/300
train loss: 0.6450830041090163
----------
Epoch: 49/300
train loss: 0.6330659195053868
----------
Epoch: 50/300
train loss: 0.6390185023589832
----------
Epoch: 51/300
train loss: 0.6335206407054168
----------
Epoch: 52/300
train loss: 0.6487786481838401
----------
Epoch: 53/300
train loss: 0.6242503134034029
----------
Epoch: 54/300
train loss: 0.629002622987439
----------
Epoch: 55/300
train loss: 0.6285576712249256
----------
Epoch: 56/300
train loss: 0.6151385171020903
----------
Epoch: 57/300
train loss: 0.618396897959273
----------
Epoch: 58/300
train loss: 0.6186871595862435
----------
Epoch: 59/300
train loss: 0.6152329195926829
----------
Epoch: 60/300
train loss: 0.6118716762923613
----------
Epoch: 61/300
train loss: 0.6106682838644923
----------
Epoch: 62/300
train loss: 0.605608502084889
----------
Epoch: 63/300
train loss: 0.6064907143392214
----------
Epoch: 64/300
train loss: 0.6069014861998034
----------
Epoch: 65/300
train loss: 0.608613460256559
----------
Epoch: 66/300
train loss: 0.6001394573689961
----------
Epoch: 67/300
train loss: 0.6002942952441006
----------
Epoch: 68/300
train loss: 0.5860573515841146
----------
Epoch: 69/300
train loss: 0.5889460247282575
----------
Epoch: 70/300
train loss: 0.5920564295133439
----------
Epoch: 71/300
train loss: 0.5855444338081813
----------
Epoch: 72/300
train loss: 0.5848859441353054
----------
Epoch: 73/300
train loss: 0.5868136161347715
----------
Epoch: 74/300
train loss: 0.593527490772852
----------
Epoch: 75/300
train loss: 0.5777589420719844
----------
Epoch: 76/300
train loss: 0.5826347842812538
----------
Epoch: 77/300
train loss: 0.5838093518665651
----------
Epoch: 78/300
train loss: 0.5841529097680639
----------
Epoch: 79/300
train loss: 0.5777867313746999
----------
Epoch: 80/300
train loss: 0.5788609479076978
----------
Epoch: 81/300
train loss: 0.5771293887277928
----------
Epoch: 82/300
train loss: 0.5715940859077907
----------
Epoch: 83/300
train loss: 0.5743454047819463
----------
Epoch: 84/300
train loss: 0.5756655906940379
----------
Epoch: 85/300
train loss: 0.5661546761488042
----------
Epoch: 86/300
train loss: 0.5686924996354231
----------
Epoch: 87/300
train loss: 0.572288849731771
----------
Epoch: 88/300
train loss: 0.5718413282458376
----------
Epoch: 89/300
train loss: 0.5705293969410222
----------
Epoch: 90/300
train loss: 0.5671022519832705
----------
Epoch: 91/300
train loss: 0.5707364989126601
----------
Epoch: 92/300
train loss: 0.567001880759873
----------
Epoch: 93/300
train loss: 0.5718449552248164
----------
Epoch: 94/300
train loss: 0.5605094927113231
----------
Epoch: 95/300
train loss: 0.5588691746134583
----------
Epoch: 96/300
train loss: 0.563779655571391
----------
Epoch: 97/300
train loss: 0.5727203829986293
----------
Epoch: 98/300
train loss: 0.5606645670060705
----------
Epoch: 99/300
train loss: 0.5655639344235746
----------
Epoch: 100/300
train loss: 0.5642829932635878
----------
Epoch: 101/300
train loss: 0.5613318870707256
----------
Epoch: 102/300
train loss: 0.559509095985715
----------
Epoch: 103/300
train loss: 0.5671124627313963
----------
Epoch: 104/300
train loss: 0.5616818245772909
----------
Epoch: 105/300
train loss: 0.5619957823215461
----------
Epoch: 106/300
train loss: 0.5567884017236349
----------
Epoch: 107/300
train loss: 0.5557383496950312
----------
Epoch: 108/300
train loss: 0.5642311291360274
----------
Epoch: 109/300
train loss: 0.5662872087119556
----------
Epoch: 110/300
train loss: 0.5590864852434252
----------
Epoch: 111/300
train loss: 0.5648398053173612
----------
Epoch: 112/300
train loss: 0.5580323726483961
----------
Epoch: 113/300
train loss: 0.5567497580153186
----------
Epoch: 114/300
train loss: 0.5630400580603901
----------
Epoch: 115/300
train loss: 0.5538424374308528
----------
Epoch: 116/300
train loss: 0.5580180316436582
----------
Epoch: 117/300
train loss: 0.556875641026148
----------
Epoch: 118/300
train loss: 0.5678275541016241
----------
Epoch: 119/300
train loss: 0.560026458850721
----------
Epoch: 120/300
train loss: 0.5611625352828968
----------
Epoch: 121/300
train loss: 0.551834630530055
----------
Epoch: 122/300
train loss: 0.5544438309422354
----------
Epoch: 123/300
train loss: 0.5569952729998565
----------
Epoch: 124/300
train loss: 0.5532170316976744
----------
Epoch: 125/300
train loss: 0.5545102213395805
----------
Epoch: 126/300
train loss: 0.5570076818509799
----------
Epoch: 127/300
train loss: 0.5552769386913718
----------
Epoch: 128/300
train loss: 0.555146463942237
----------
Epoch: 129/300
train loss: 0.5533929106484099
----------
Epoch: 130/300
train loss: 0.5599716932671827
----------
Epoch: 131/300
train loss: 0.5522712502719426
----------
Epoch: 132/300
train loss: 0.5526935269192952
----------
Epoch: 133/300
train loss: 0.5498186174507548
----------
Epoch: 134/300
train loss: 0.558319001663022
----------
Epoch: 135/300
train loss: 0.567952715678186
----------
Epoch: 136/300
train loss: 0.5496662398118798
----------
Epoch: 137/300
train loss: 0.5459267346233856
----------
Epoch: 138/300
train loss: 0.5548020657606241
----------
Epoch: 139/300
train loss: 0.5581261336621715
----------
Epoch: 140/300
train loss: 0.555513132272697
----------
Epoch: 141/300
train loss: 0.5425212532281876
----------
Epoch: 142/300
train loss: 0.554408071335496
----------
Epoch: 143/300
train loss: 0.5611961568455871
----------
Epoch: 144/300
train loss: 0.5497159719830607
----------
Epoch: 145/300
train loss: 0.553985515291371
----------
Epoch: 146/300
train loss: 0.552209698027227
----------
Epoch: 147/300
train loss: 0.5482723026922564
----------
Epoch: 148/300
train loss: 0.5597864095030761
----------
Epoch: 149/300
train loss: 0.5535240913854866
----------
Epoch: 150/300
train loss: 0.5547397449249174
----------
Epoch: 151/300
train loss: 0.5516444017610899
----------
Epoch: 152/300
train loss: 0.5467582678467762
----------
Epoch: 153/300
train loss: 0.5556213542273859
----------
Epoch: 154/300
train loss: 0.5559055385429684
----------
Epoch: 155/300
train loss: 0.5527037072290735
----------
Epoch: 156/300
train loss: 0.5595959684834247
----------
Epoch: 157/300
train loss: 0.5552922378226024
----------
Epoch: 158/300
train loss: 0.5588282246778651
----------
Epoch: 159/300
train loss: 0.5568678265482914
----------
Epoch: 160/300
train loss: 0.5509358892535291
----------
Epoch: 161/300
train loss: 0.55544380798209
----------
Epoch: 162/300
train loss: 0.5503908054130834
----------
Epoch: 163/300
train loss: 0.5490232755134745
----------
Epoch: 164/300
train loss: 0.55477098148407
----------
Epoch: 165/300
train loss: 0.5499786233938322
----------
Epoch: 166/300
train loss: 0.5489160029626474
----------
Epoch: 167/300
train loss: 0.5535752333518935
----------
Epoch: 168/300
train loss: 0.5517634667637872
----------
Epoch: 169/300
train loss: 0.5544024943941976
----------
Epoch: 170/300
train loss: 0.5567889806882638
----------
Epoch: 171/300
train loss: 0.5537487953537847
----------
Epoch: 172/300
train loss: 0.5505024468208232
----------
Epoch: 173/300
train loss: 0.5541055284258796
----------
Epoch: 174/300
train loss: 0.5564219584552254
----------
Epoch: 175/300
train loss: 0.5509056932315594
----------
Epoch: 176/300
train loss: 0.5506802266690789
----------
Epoch: 177/300
train loss: 0.5518694978843375
----------
Epoch: 178/300
train loss: 0.5442488067215536
----------
Epoch: 179/300
train loss: 0.5474909305027346
----------
Epoch: 180/300
train loss: 0.5438729580946084
----------
Epoch: 181/300
train loss: 0.5497829824140886
----------
Epoch: 182/300
train loss: 0.5528207572131623
----------
Epoch: 183/300
train loss: 0.5478700979090319
----------
Epoch: 184/300
train loss: 0.5556103761603193
----------
Epoch: 185/300
train loss: 0.5493665297583836
----------
Epoch: 186/300
train loss: 0.5469489639125219
----------
Epoch: 187/300
train loss: 0.5473163739391943
----------
Epoch: 188/300
train loss: 0.5498563008519207
----------
Epoch: 189/300
train loss: 0.5418999908355678
----------
Epoch: 190/300
train loss: 0.5562262407890181
----------
Epoch: 191/300
train loss: 0.5534045947034184
----------
Epoch: 192/300
train loss: 0.5485934589694186
----------
Epoch: 193/300
train loss: 0.5419396719918018
----------
Epoch: 194/300
train loss: 0.5514212181655372
----------
Epoch: 195/300
train loss: 0.5535368068007435
----------
Epoch: 196/300
train loss: 0.5464842418526731
----------
Epoch: 197/300
train loss: 0.5495766509415173
----------
Epoch: 198/300
train loss: 0.5532617549888972
----------
Epoch: 199/300
train loss: 0.5452990282054354
----------
Epoch: 200/300
train loss: 0.553361903966927
----------
Epoch: 201/300
train loss: 0.5432001394106121
----------
Epoch: 202/300
train loss: 0.550419543574496
----------
Epoch: 203/300
train loss: 0.5534595720833395
----------
Epoch: 204/300
train loss: 0.5453232568575115
----------
Epoch: 205/300
train loss: 0.5451060552604314
----------
Epoch: 206/300
train loss: 0.5556593400130911
----------
Epoch: 207/300
train loss: 0.557492576721238
----------
Epoch: 208/300
train loss: 0.5553175620734692
----------
Epoch: 209/300
train loss: 0.5433410697775644
----------
Epoch: 210/300
train loss: 0.5517896733632902
----------
Epoch: 211/300
train loss: 0.5420230569817671
----------
Epoch: 212/300
train loss: 0.553179111422562
----------
Epoch: 213/300
train loss: 0.5532874674877015
----------
Epoch: 214/300
train loss: 0.5513115395314809
----------
Epoch: 215/300
train loss: 0.5477715160061674
----------
Epoch: 216/300
train loss: 0.5524102382180167
----------
Epoch: 217/300
train loss: 0.5583272420960229
----------
Epoch: 218/300
train loss: 0.5493520412866663
----------
Epoch: 219/300
train loss: 0.5505585817665588
----------
Epoch: 220/300
train loss: 0.5513288827567566
----------
Epoch: 221/300
train loss: 0.5509700740619403
----------
Epoch: 222/300
train loss: 0.5471859193611436
----------
Epoch: 223/300
train loss: 0.5479258945802363
----------
Epoch: 224/300
train loss: 0.5468155258312458
----------
Epoch: 225/300
train loss: 0.5532922947370424
----------
Epoch: 226/300
train loss: 0.5509961512757511
----------
Epoch: 227/300
train loss: 0.5566592513606315
----------
Epoch: 228/300
train loss: 0.5468720855509362
----------
Epoch: 229/300
train loss: 0.5485118084200998
----------
Epoch: 230/300
train loss: 0.5553761393558688
----------
Epoch: 231/300
train loss: 0.5538774156352368
----------
Epoch: 232/300
train loss: 0.5535862693881116
----------
Epoch: 233/300
train loss: 0.559648928482358
----------
Epoch: 234/300
train loss: 0.5566149992732013
----------
Epoch: 235/300
train loss: 0.5537716771771268
----------
Epoch: 236/300
train loss: 0.5482096226840485
----------
Epoch: 237/300
train loss: 0.5522753771667074
----------
Epoch: 238/300
train loss: 0.553921124738891
----------
Epoch: 239/300
train loss: 0.5568602827445763
----------
Epoch: 240/300
train loss: 0.5509852958706821
----------
Epoch: 241/300
train loss: 0.5503496584914079
----------
Epoch: 242/300
train loss: 0.5573272326188844
----------
Epoch: 243/300
train loss: 0.5485873044264026
----------
Epoch: 244/300
train loss: 0.5551095642149448
----------
Epoch: 245/300
train loss: 0.555719885429958
----------
Epoch: 246/300
train loss: 0.5547739392737063
----------
Epoch: 247/300
train loss: 0.5579677987389449
----------
Epoch: 248/300
train loss: 0.5550545382245284
----------
Epoch: 249/300
train loss: 0.5540565562139197
----------
Epoch: 250/300
train loss: 0.5495907050989023
----------
Epoch: 251/300
train loss: 0.5591612574530811
----------
Epoch: 252/300
train loss: 0.5526787800396361
----------
Epoch: 253/300
train loss: 0.5496814871524892
----------
Epoch: 254/300
train loss: 0.5540165307863456
----------
Epoch: 255/300
train loss: 0.5507408661268106
----------
Epoch: 256/300
train loss: 0.5493098404712793
----------
Epoch: 257/300
train loss: 0.5516996947912182
----------
Epoch: 258/300
train loss: 0.5472092830189844
----------
Epoch: 259/300
train loss: 0.5510950345636868
----------
Epoch: 260/300
train loss: 0.5474240792597213
----------
Epoch: 261/300
train loss: 0.5587661537091907
----------
Epoch: 262/300
train loss: 0.5486958720153425
----------
Epoch: 263/300
train loss: 0.555758919476009
----------
Epoch: 264/300
train loss: 0.5515954894263569
----------
Epoch: 265/300
train loss: 0.5496930155630518
----------
Epoch: 266/300
train loss: 0.5465638443100743
----------
Epoch: 267/300
train loss: 0.5538034867040995
----------
Epoch: 268/300
train loss: 0.5515833862307595
----------
Epoch: 269/300
train loss: 0.5496920899647039
----------
Epoch: 270/300
train loss: 0.550315346841405
----------
Epoch: 271/300
train loss: 0.5523107038220254
----------
Epoch: 272/300
train loss: 0.5502457329776229
----------
Epoch: 273/300
train loss: 0.5546982049396852
----------
Epoch: 274/300
train loss: 0.5525404347515688
----------
Epoch: 275/300
train loss: 0.5541610901312131
----------
Epoch: 276/300
train loss: 0.5573831272379655
----------
Epoch: 277/300
train loss: 0.549906870759115
----------
Epoch: 278/300
train loss: 0.5567150217730824
----------
Epoch: 279/300
train loss: 0.5559126037831713
----------
Epoch: 280/300
train loss: 0.5517111808606764
----------
Epoch: 281/300
train loss: 0.552947857757894
----------
Epoch: 282/300
train loss: 0.5567048276524719
----------
Epoch: 283/300
train loss: 0.5521830897323969
----------
Epoch: 284/300
train loss: 0.5454606041312218
----------
Epoch: 285/300
train loss: 0.5507395709978371
----------
Epoch: 286/300
train loss: 0.5518143521394672
----------
Epoch: 287/300
train loss: 0.5519008437489591
----------
Epoch: 288/300
train loss: 0.5451099986165036
----------
Epoch: 289/300
train loss: 0.5455087039892267
----------
Epoch: 290/300
train loss: 0.545345228528831
----------
Epoch: 291/300
train loss: 0.5585307433474355
----------
Epoch: 292/300
train loss: 0.5500989946105131
----------
Epoch: 293/300
train loss: 0.5457923505000952
----------
Epoch: 294/300
train loss: 0.5575511342868572
----------
Epoch: 295/300
train loss: 0.5508520018036772
----------
Epoch: 296/300
train loss: 0.5536141810802425
----------
Epoch: 297/300
train loss: 0.5538290936227251
----------
Epoch: 298/300
train loss: 0.5453163091911049
----------
Epoch: 299/300
train loss: 0.5550332332893115
----------
Epoch: 300/300
train loss: 0.5472247938557369
Traceback (most recent call last):
  File "main.py", line 83, in <module>
    print('Total program runtime: %s ms' % ((T2 - T1) * 1000))
NameError: name 'T1' is not defined
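The run therefore completed all 300 epochs but crashed at the very end: line 83 of main.py prints the total runtime (message translated from Chinese) using `T1` and `T2`, yet `T1` was never assigned, so the start timestamp was presumably never recorded. A minimal sketch of the intended pattern, assuming wall-clock timestamps taken with the standard library (the variable names `T1`/`T2` follow the traceback; everything else is illustrative):

```python
import time

T1 = time.time()  # record the start timestamp BEFORE training begins

# ... the training loop would run here ...

T2 = time.time()  # record the end timestamp after training finishes

# What line 83 of main.py presumably intended:
print('Total program runtime: %s ms' % ((T2 - T1) * 1000))
```

Defining `T1` next to the training call (rather than inside a conditional branch that may be skipped) avoids exactly this NameError after a multi-day run.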