nohup: ignoring input
Setting up data...
Starting training...
----------
Epoch: 1/300
train loss: 3.307037867423965
----------
Epoch: 2/300
/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:154: UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
train loss: 1.8490978410331214
----------
Epoch: 3/300
train loss: 1.561141774058342
----------
Epoch: 4/300
train loss: 1.4109461198492748
----------
Epoch: 5/300
train loss: 1.3152732482043707
----------
Epoch: 6/300
train loss: 1.23120613178102
----------
Epoch: 7/300
train loss: 1.1735509862987006
----------
Epoch: 8/300
train loss: 1.1067496543613875
----------
Epoch: 9/300
train loss: 1.0879325459643108
----------
Epoch: 10/300
train loss: 1.0459921903363087
----------
Epoch: 11/300
train loss: 1.0326051524863011
----------
Epoch: 12/300
train loss: 0.9921930325103969
----------
Epoch: 13/300
train loss: 0.9749882384771253
----------
Epoch: 14/300
train loss: 0.9526009223446613
----------
Epoch: 15/300
train loss: 0.924446231344851
----------
Epoch: 16/300
train loss: 0.9120315565932088
----------
Epoch: 17/300
train loss: 0.8949574039476674
----------
Epoch: 18/300
train loss: 0.8795958457560074
----------
Epoch: 19/300
train loss: 0.8690951808196742
----------
Epoch: 20/300
train loss: 0.839984535625795
----------
Epoch: 21/300
train loss: 0.8357012524473958
----------
Epoch: 22/300
train loss: 0.8244846405052557
----------
Epoch: 23/300
train loss: 0.8116382245973843
----------
Epoch: 24/300
train loss: 0.784335666676847
----------
Epoch: 25/300
train loss: 0.7887035886325487
----------
Epoch: 26/300
train loss: 0.7719192650259995
----------
Epoch: 27/300
train loss: 0.7664937497820796
----------
Epoch: 28/300
train loss: 0.7657128625163218
----------
Epoch: 29/300
train loss: 0.7466670109367952
----------
Epoch: 30/300
train loss: 0.7491191251248848
----------
Epoch: 31/300
train loss: 0.7392067185989241
----------
Epoch: 32/300
train loss: 0.7254988480268455
----------
Epoch: 33/300
train loss: 0.72247884095442
----------
Epoch: 34/300
train loss: 0.7225755510352007
----------
Epoch: 35/300
train loss: 0.7113728040783871
----------
Epoch: 36/300
train loss: 0.6882840236876069
----------
Epoch: 37/300
train loss: 0.6928801804599238
----------
Epoch: 38/300
train loss: 0.6762253627544497
----------
Epoch: 39/300
train loss: 0.676805709375114
----------
Epoch: 40/300
train loss: 0.6736773376057787
----------
Epoch: 41/300
train loss: 0.6719661420438348
----------
Epoch: 42/300
train loss: 0.667674303599974
----------
Epoch: 43/300
train loss: 0.6575799101009602
----------
Epoch: 44/300
train loss: 0.6586109829566827
----------
Epoch: 45/300
train loss: 0.6569638466689645
----------
Epoch: 46/300
train loss: 0.646054647953772
----------
Epoch: 47/300
train loss: 0.6500433959612032
----------
Epoch: 48/300
train loss: 0.6366679424919733
----------
Epoch: 49/300
train loss: 0.6255944731031976
----------
Epoch: 50/300
train loss: 0.6329708500061093
----------
Epoch: 51/300
train loss: 0.6252293430450486
----------
Epoch: 52/300
train loss: 0.635402276112539
----------
Epoch: 53/300
train loss: 0.6177184131087327
----------
Epoch: 54/300
train loss: 0.6235615758270752
----------
Epoch: 55/300
train loss: 0.6224451232247237
----------
Epoch: 56/300
train loss: 0.610399562047749
----------
Epoch: 57/300
train loss: 0.6147097092030979
----------
Epoch: 58/300
train loss: 0.6103556625908468
----------
Epoch: 59/300
train loss: 0.6093381568789482
----------
Epoch: 60/300
train loss: 0.5974853209606031
----------
Epoch: 61/300
train loss: 0.607551729715452
----------
Epoch: 62/300
train loss: 0.6001740366038752
----------
Epoch: 63/300
train loss: 0.5975715004634566
----------
Epoch: 64/300
train loss: 0.6036988319601955
----------
Epoch: 65/300
train loss: 0.6021600912620382
----------
Epoch: 66/300
train loss: 0.5951395198339369
----------
Epoch: 67/300
train loss: 0.5903787380311547
----------
Epoch: 68/300
train loss: 0.5812304458785348
----------
Epoch: 69/300
train loss: 0.5848958731242796
----------
Epoch: 70/300
train loss: 0.5830151068909866
----------
Epoch: 71/300
train loss: 0.5838195401720885
----------
Epoch: 72/300
train loss: 0.5782773125825859
----------
Epoch: 73/300
train loss: 0.5841681860932489
----------
Epoch: 74/300
train loss: 0.5878641362779025
----------
Epoch: 75/300
train loss: 0.5744512092049529
----------
Epoch: 76/300
train loss: 0.5784077225480138
----------
Epoch: 77/300
train loss: 0.5778905999188017
----------
Epoch: 78/300
train loss: 0.5806078254813101
----------
Epoch: 79/300
train loss: 0.5736329948938474
----------
Epoch: 80/300
train loss: 0.577592335005359
----------
Epoch: 81/300
train loss: 0.5712515053044005
----------
Epoch: 82/300
train loss: 0.5629513234626956
----------
Epoch: 83/300
train loss: 0.5674999619765979
----------
Epoch: 84/300
train loss: 0.572166397168142
----------
Epoch: 85/300
train loss: 0.5600198752632956
----------
Epoch: 86/300
train loss: 0.5641529346566375
----------
Epoch: 87/300
train loss: 0.5696737862578253
----------
Epoch: 88/300
train loss: 0.5660816598229292
----------
Epoch: 89/300
train loss: 0.5631480350545267
----------
Epoch: 90/300
train loss: 0.557942715087315
----------
Epoch: 91/300
train loss: 0.5627899840474129
----------
Epoch: 92/300
train loss: 0.5633124596462017
----------
Epoch: 93/300
train loss: 0.5645929554977068
----------
Epoch: 94/300
train loss: 0.5542543557722394
----------
Epoch: 95/300
train loss: 0.5531571798208283
----------
Epoch: 96/300
train loss: 0.557213164048224
----------
Epoch: 97/300
train loss: 0.5653773972355738
----------
Epoch: 98/300
train loss: 0.557706034038125
----------
Epoch: 99/300
train loss: 0.556364589348072
----------
Epoch: 100/300
train loss: 0.5555244980425369
----------
Epoch: 101/300
train loss: 0.5536440839854683
----------
Epoch: 102/300
train loss: 0.5535611885531646
----------
Epoch: 103/300
train loss: 0.5573019144556871
----------
Epoch: 104/300
train loss: 0.5536222801339336
----------
Epoch: 105/300
train loss: 0.5522710204851337
----------
Epoch: 106/300
train loss: 0.5528217791238936
----------
Epoch: 107/300
train loss: 0.5496333808433719
----------
Epoch: 108/300
train loss: 0.5561074961612864
----------
Epoch: 109/300
train loss: 0.5574712738758181
----------
Epoch: 110/300
train loss: 0.5491735800737287
----------
Epoch: 111/300
train loss: 0.5571781733050579
----------
Epoch: 112/300
train loss: 0.555856435309823
----------
Epoch: 113/300
train loss: 0.5511427012885489
----------
Epoch: 114/300
train loss: 0.5534139981538784
----------
Epoch: 115/300
train loss: 0.5492776229795886
----------
Epoch: 116/300
train loss: 0.5522508076051387
----------
Epoch: 117/300
train loss: 0.5524138775540561
----------
Epoch: 118/300
train loss: 0.5566331935001583
----------
Epoch: 119/300
train loss: 0.5521774955275582
----------
Epoch: 120/300
train loss: 0.5556204090939789
----------
Epoch: 121/300
train loss: 0.5481414351521469
----------
Epoch: 122/300
train loss: 0.5481620049331246
----------
Epoch: 123/300
train loss: 0.5498910303704623
----------
Epoch: 124/300
train loss: 0.5474408364332304
----------
Epoch: 125/300
train loss: 0.5461990730609836
----------
Epoch: 126/300
train loss: 0.5499345149572302
----------
Epoch: 127/300
train loss: 0.5495844648742094
----------
Epoch: 128/300
train loss: 0.5491425968343164
----------
Epoch: 129/300
train loss: 0.550200116525336
----------
Epoch: 130/300
train loss: 0.5534766463608276
----------
Epoch: 131/300
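The UserWarning in the log above is raised when the training script passes an epoch argument to `scheduler.step(epoch)`. The log does not show which scheduler the script actually uses, so the following is only a minimal sketch of the preferred call pattern, using a hypothetical `StepLR` scheduler on a toy model:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

# Toy model and optimizer, only to illustrate the scheduler API.
model = torch.nn.Linear(2, 2)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    # ... forward/backward pass would go here ...
    optimizer.step()   # step the optimizer first
    # Deprecated: scheduler.step(epoch)  -> triggers the UserWarning seen in the log
    scheduler.step()   # preferred: no epoch argument, chainable form

# lr is halved once per epoch: 0.1 -> 0.05 -> 0.025 -> 0.0125
print(optimizer.param_groups[0]["lr"])
```

Dropping the epoch argument silences the warning without changing behavior for standard schedulers, since they track their own step count internally.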