nohup: ignoring input
Setting up data...
Starting training...
Epoch: 1/300  train loss: 3.4394320607554434
/home/thsw/anaconda3/envs/yolov5_bridge/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:154: UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
  warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
Epoch: 2/300  train loss: 1.930078529721074
Epoch: 3/300  train loss: 1.633405933439178
Epoch: 4/300  train loss: 1.4825877442079431
Epoch: 5/300  train loss: 1.3513453146990608
Epoch: 6/300  train loss: 1.284238253584587
Epoch: 7/300  train loss: 1.1909992801884748
Epoch: 8/300  train loss: 1.1763500916330438
Epoch: 9/300  train loss: 1.1247926802827108
Epoch: 10/300  train loss: 1.0886840263018298
Epoch: 11/300  train loss: 1.0634160527134828
Epoch: 12/300  train loss: 1.031403729605601
Epoch: 13/300  train loss: 1.0176705832451858
Epoch: 14/300  train loss: 0.9851431567971551
Epoch: 15/300  train loss: 0.9842541802778333
Epoch: 16/300  train loss: 0.9521809052387628
Epoch: 17/300  train loss: 0.9335779822647756
Epoch: 18/300  train loss: 0.9177891434899794
Epoch: 19/300  train loss: 0.89844102848425
Epoch: 20/300  train loss: 0.8756160040388904
Epoch: 21/300  train loss: 0.867238417313933
Epoch: 22/300  train loss: 0.8499307780073893
Epoch: 23/300  train loss: 0.842164253302772
Epoch: 24/300  train loss: 0.8305777697002187
Epoch: 25/300  train loss: 0.8163966628789163
Epoch: 26/300  train loss: 0.8109905509387746
Epoch: 27/300  train loss: 0.8094634806777671
Epoch: 28/300  train loss: 0.7870301117099845
Epoch: 29/300  train loss: 0.7851962270935992
Epoch: 30/300  train loss: 0.7699445513748902
Epoch: 31/300  train loss: 0.7692669129962153
Epoch: 32/300  train loss: 0.7567083329977266
Epoch: 33/300  train loss: 0.7474709262050712
Epoch: 34/300  train loss: 0.748934083678774
Epoch: 35/300  train loss: 0.7352799732618657
Epoch: 36/300  train loss: 0.7257756930386688
Epoch: 37/300  train loss: 0.7222192510731819
Epoch: 38/300  train loss: 0.7172380039935511
Epoch: 39/300  train loss: 0.7211404704826166
Epoch: 40/300  train loss: 0.7054665542608444
Epoch: 41/300  train loss: 0.6915910904621562
Epoch: 42/300  train loss: 0.6871611627262812
Epoch: 43/300  train loss: 0.688079390178893
Epoch: 44/300  train loss: 0.6753499621392772
Epoch: 45/300  train loss: 0.6777258363117006
Epoch: 46/300  train loss: 0.6820033273829764
Epoch: 47/300  train loss: 0.6689407763650912
Epoch: 48/300  train loss: 0.6597786256034308
Epoch: 49/300  train loss: 0.6596399033586308
Epoch: 50/300  train loss: 0.6629376999180383
Epoch: 51/300  train loss: 0.6641064552884353
Epoch: 52/300  train loss: 0.6577023321260977
Epoch: 53/300  train loss: 0.6460512635693092
Epoch: 54/300  train loss: 0.6521802763267198
Epoch: 55/300  train loss: 0.636500123853654
Epoch: 56/300  train loss: 0.6382356001865753
Epoch: 57/300  train loss: 0.6377618587792104
Epoch: 58/300  train loss: 0.6327137644445933
Epoch: 59/300  train loss: 0.6347541431708971
Epoch: 60/300  train loss: 0.629689353722906
Epoch: 61/300  train loss: 0.629009592182496
Epoch: 62/300  train loss: 0.6273051674705541
Epoch: 63/300  train loss: 0.6193134883424446
Epoch: 64/300  train loss: 0.6325272070734125
Epoch: 65/300  train loss: 0.6154070925601864
Epoch: 66/300  train loss: 0.6071137105901913
Epoch: 67/300  train loss: 0.6189665371788544
Epoch: 68/300  train loss: 0.6039990106418774
Epoch: 69/300  train loss: 0.6081831483833561
Epoch: 70/300  train loss: 0.6164965122107751
Epoch: 71/300  train loss: 0.6093785233172839
Epoch: 72/300  train loss: 0.6064768053429783
Epoch: 73/300  train loss: 0.6042170901047555
Epoch: 74/300  train loss: 0.607218999980773
Epoch: 75/300  train loss: 0.6010280915827205
Epoch: 76/300  train loss: 0.6088563133688534
Epoch: 77/300  train loss: 0.5960819613269239
Epoch: 78/300  train loss: 0.5995096006076033
Epoch: 79/300  train loss: 0.6007698672843791
Epoch: 80/300  train loss: 0.5952687222891179
Epoch: 81/300  train loss: 0.603035535513432
Epoch: 82/300  train loss: 0.5906850624564263
Epoch: 83/300  train loss: 0.5875769174504945
Epoch: 84/300  train loss: 0.5973145653588853
Epoch: 85/300  train loss: 0.5833648518881193
Epoch: 86/300  train loss: 0.5883274316418651
Epoch: 87/300  train loss: 0.5961968240907687
Epoch: 88/300  train loss: 0.5970893593579992
Epoch: 89/300  train loss: 0.5811341264668632
Epoch: 90/300  train loss: 0.5812809775488296
Epoch: 91/300  train loss: 0.5839183432029866
Epoch: 92/300  train loss: 0.5849279689345935
Epoch: 93/300  train loss: 0.5826380343260041
Epoch: 94/300  train loss: 0.5840902419835791
Epoch: 95/300  train loss: 0.5825643528356641
Epoch: 96/300  train loss: 0.5778747019199395
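The UserWarning emitted at epoch 2 comes from passing an explicit epoch number to `scheduler.step(epoch)`; current PyTorch expects the chainable form with no argument. A minimal sketch of the warning-free call pattern (the model, optimizer, and StepLR schedule below are hypothetical stand-ins, not the training script's actual setup):

```python
import torch

# Hypothetical model/optimizer so the scheduler has something to drive;
# the real script's model and hyperparameters are not visible in this log.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(30):
    # ... forward pass, loss.backward(), etc. would go here ...
    optimizer.step()   # optimizer steps first (PyTorch >= 1.1 ordering)
    scheduler.step()   # no epoch argument: chainable form, no deprecation warning

# After 30 epochs with step_size=10, gamma=0.5: lr = 0.1 * 0.5**3
print(optimizer.param_groups[0]["lr"])
```

Dropping the `epoch` argument is usually a no-op change in behavior, since the scheduler tracks its own step count internally.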
Epoch: 97/300  train loss: 0.5842624517048106
Epoch: 98/300  train loss: 0.5786413823487958
Epoch: 99/300  train loss: 0.5854276273826328
Epoch: 100/300  train loss: 0.5831200461084998
Epoch: 101/300  train loss: 0.5781117375052012
Epoch: 102/300  train loss: 0.5853699297174212
Epoch: 103/300  train loss: 0.5790560357157291
Epoch: 104/300  train loss: 0.5730563790621034
Epoch: 105/300  train loss: 0.5763303658726046
Epoch: 106/300  train loss: 0.5797402464752965
Epoch: 107/300  train loss: 0.5812801884792906
Epoch: 108/300  train loss: 0.5763228203490054
Epoch: 109/300  train loss: 0.5762089939309347
Epoch: 110/300  train loss: 0.5756880531185552
Epoch: 111/300  train loss: 0.5692944096706969
Epoch: 112/300  train loss: 0.5768693202973888
Epoch: 113/300  train loss: 0.5789581728239915
Epoch: 114/300  train loss: 0.5717335509441954
Epoch: 115/300  train loss: 0.5798303593792045
Epoch: 116/300  train loss: 0.5730079361713338
Epoch: 117/300  train loss: 0.5750909528865165
Epoch: 118/300  train loss: 0.5771061666425168
Epoch: 119/300  train loss: 0.5824603371944959
Epoch: 120/300  train loss: 0.5709850619267384
Epoch: 121/300  train loss: 0.5749542965054881
Epoch: 122/300  train loss: 0.5685901321678338
Epoch: 123/300  train loss: 0.5765443899129566
Epoch: 124/300  train loss: 0.5766789385408809
Epoch: 125/300  train loss: 0.575109283898268
Epoch: 126/300  train loss: 0.5734074563249346
Epoch: 127/300  train loss: 0.5787333538842275
Epoch: 128/300  train loss: 0.5745735660241484
Epoch: 129/300  train loss: 0.5669669184153294
Epoch: 130/300  train loss: 0.5708108556344413
Epoch: 131/300  train loss: 0.5693512706564676
Epoch: 132/300  train loss: 0.569030460742974
Epoch: 133/300  train loss: 0.5710507141362772
Epoch: 134/300  train loss: 0.5748469328548148
Epoch: 135/300  train loss: 0.5773823523299982
Epoch: 136/300  train loss: 0.5693722617330935
Epoch: 137/300  train loss: 0.567474852220931
Epoch: 138/300  train loss: 0.5759126903841001
Epoch: 139/300  train loss: 0.5726324898355147
Epoch: 140/300  train loss: 0.5715209749460959
Epoch: 141/300  train loss: 0.5773299010730011
Epoch: 142/300  train loss: 0.566016223962093
Epoch: 143/300  train loss: 0.5756766654943165
Epoch: 144/300  train loss: 0.5762189758635896
Epoch: 145/300  train loss: 0.5703497758775303
Epoch: 146/300  train loss: 0.5751561454206059
Epoch: 147/300  train loss: 0.5723920575236389
Epoch: 148/300  train loss: 0.5694741011588567
Epoch: 149/300  train loss: 0.5725228538269598
Epoch: 150/300  train loss: 0.5694768740665802
Epoch: 151/300  train loss: 0.5667742189054519
Epoch: 152/300  train loss: 0.5780753946341228
Epoch: 153/300  train loss: 0.5716342858670297
Epoch: 154/300  train loss: 0.5683843594032914
Epoch: 155/300  train loss: 0.5782360846044109
Epoch: 156/300  train loss: 0.5695936934497703
Epoch: 157/300  train loss: 0.5644589573975318
Epoch: 158/300  train loss: 0.5699425021376772
Epoch: 159/300  train loss: 0.5675014005541432
Epoch: 160/300  train loss: 0.5726423774710381
Epoch: 161/300  train loss: 0.5736665025399565
Epoch: 162/300  train loss: 0.5683291724222732
Epoch: 163/300  train loss: 0.5703395354120355
Epoch: 164/300  train loss: 0.5666843002241093
Epoch: 165/300  train loss: 0.5642050609869116
Epoch: 166/300  train loss: 0.5783111298047352
Epoch: 167/300  train loss: 0.5688314526442773
Epoch: 168/300  train loss: 0.5649547326860044
Epoch: 169/300  train loss: 0.5662805891996566
Epoch: 170/300  train loss: 0.5690607213568023
Epoch: 171/300  train loss: 0.5693840228551682
Epoch: 172/300  train loss: 0.5664498788468978
Epoch: 173/300  train loss: 0.5662287851974322
Epoch: 174/300  train loss: 0.5706901245811037
Epoch: 175/300  train loss: 0.5746730549416675
Epoch: 176/300  train loss: 0.5749756673541231
Epoch: 177/300  train loss: 0.5743117327845133
Epoch: 178/300  train loss: 0.5589386003305299
Epoch: 179/300  train loss: 0.5710129630823991
Epoch: 180/300  train loss: 0.5704299873058272
Epoch: 181/300  train loss: 0.5674120885299825
Epoch: 182/300  train loss: 0.559411147338318
Epoch: 183/300  train loss: 0.5742758797602757
Epoch: 184/300  train loss: 0.5652756291461802
Epoch: 185/300  train loss: 0.5745003632163116
Epoch: 186/300  train loss: 0.5711110114313132
Epoch: 187/300  train loss: 0.5659535459874215
Epoch: 188/300  train loss: 0.574384407823669
Epoch: 189/300  train loss: 0.5667811529739722
Epoch: 190/300  train loss: 0.5642454672154996
Epoch: 191/300  train loss: 0.5740268274726513
Epoch: 192/300  train loss: 0.5629734715256529
Epoch: 193/300  train loss: 0.5723785064168759
Epoch: 194/300  train loss: 0.5708345717320871
Epoch: 195/300  train loss: 0.5627649647901671
Epoch: 196/300  train loss: 0.5750793810045756
Epoch: 197/300  train loss: 0.5741033718312857
Epoch: 198/300  train loss: 0.5778550891315236
Epoch: 199/300  train loss: 0.5620668767591009
Epoch: 200/300  train loss: 0.5723251026850367
Epoch: 201/300  train loss: 0.5726388204762072
Epoch: 202/300  train loss: 0.5720572968016467
Epoch: 203/300  train loss: 0.5737994054891745
Epoch: 204/300  train loss: 0.5726441243669197
Epoch: 205/300  train loss: 0.5654018422582939
Epoch: 206/300  train loss: 0.5721300107775827
Epoch: 207/300  train loss: 0.5674502250764392
Epoch: 208/300  train loss: 0.5661794402097401
Epoch: 209/300  train loss: 0.565577471496151
Epoch: 210/300  train loss: 0.5717732342588643
Epoch: 211/300  train loss: 0.5689131697818591
Epoch: 212/300  train loss: 0.5645846638701648
Epoch: 213/300  train loss: 0.5738475428830728
Epoch: 214/300  train loss: 0.5720115142525534
Epoch: 215/300  train loss: 0.5752112238953357
Epoch: 216/300  train loss: 0.5772492190817192
Epoch: 217/300  train loss: 0.5704983932315012
Epoch: 218/300  train loss: 0.569904790205114
Epoch: 219/300  train loss: 0.5742690336224464
Epoch: 220/300  train loss: 0.568482704575955
Epoch: 221/300  train loss: 0.5715297753965891
Epoch: 222/300  train loss: 0.5710316764496428
Epoch: 223/300  train loss: 0.5715734553595445
Epoch: 224/300  train loss: 0.5672295156278109
Epoch: 225/300  train loss: 0.5646657074568072
Epoch: 226/300  train loss: 0.5673439821960756
Epoch: 227/300  train loss: 0.5700345693542492
Epoch: 228/300  train loss: 0.5602112738156098
Epoch: 229/300  train loss: 0.5691522084891611
Epoch: 230/300  train loss: 0.5622219814235581
Epoch: 231/300  train loss: 0.5705981853392103
Epoch: 232/300  train loss: 0.5717427863425145
Epoch: 233/300  train loss: 0.566420454358907
Epoch: 234/300  train loss: 0.5703431164886191
Epoch: 235/300  train loss: 0.5664396363515234
Epoch: 236/300  train loss: 0.5644527944987034
Epoch: 237/300  train loss: 0.5765276849269867
Epoch: 238/300  train loss: 0.5697608117717707
Epoch: 239/300  train loss: 0.5707514197464698
Epoch: 240/300  train loss: 0.568966758694073
Epoch: 241/300  train loss: 0.5714757817079408
Epoch: 242/300  train loss: 0.5690984512814796
Epoch: 243/300  train loss: 0.5704802971321732
Epoch: 244/300  train loss: 0.5740080759990326
Epoch: 245/300  train loss: 0.5745089057798356
Epoch: 246/300  train loss: 0.5653779470699122
Epoch: 247/300  train loss: 0.5691389676771665
Epoch: 248/300  train loss: 0.5720810018272223
Epoch: 249/300  train loss: 0.5658170342814443
Epoch: 250/300  train loss: 0.5725755191439814
Epoch: 251/300  train loss: 0.5666882629180471
Epoch: 252/300  train loss: 0.564656544119212
Epoch: 253/300  train loss: 0.5677354193693344
Epoch: 254/300  train loss: 0.5748839044349482
Epoch: 255/300  train loss: 0.5665894175646106
Epoch: 256/300  train loss: 0.567643519341023
Epoch: 257/300  train loss: 0.5634990612235231
Epoch: 258/300  train loss: 0.5685394195026657
Epoch: 259/300  train loss: 0.5634476017472175
Epoch: 260/300  train loss: 0.5658026995489103
Epoch: 261/300  train loss: 0.5762477362119007
Epoch: 262/300  train loss: 0.5678475147614908
Epoch: 263/300  train loss: 0.569275289773941
Epoch: 264/300  train loss: 0.5606244695444963
Epoch: 265/300  train loss: 0.5778651704360088
Epoch: 266/300  train loss: 0.5702718371761841
Epoch: 267/300  train loss: 0.5581895816067793
Epoch: 268/300  train loss: 0.5675852225845444
Epoch: 269/300  train loss: 0.5733042543886616
Epoch: 270/300  train loss: 0.5750072927851426
Epoch: 271/300  train loss: 0.56704161253876
Epoch: 272/300  train loss: 0.57172726606806
Epoch: 273/300  train loss: 0.566815867604855
Epoch: 274/300  train loss: 0.5693643953962592
Epoch: 275/300  train loss: 0.5793589622981777
Epoch: 276/300  train loss: 0.5724858647160486
Epoch: 277/300  train loss: 0.574933646264091
Epoch: 278/300  train loss: 0.5787208814739073
Epoch: 279/300  train loss: 0.572715374522903
Epoch: 280/300  train loss: 0.5646711404294052
Epoch: 281/300  train loss: 0.5654006999897145
Epoch: 282/300  train loss: 0.5695910326467579
Epoch: 283/300  train loss: 0.5692749171987775
Epoch: 284/300  train loss: 0.5623931064509755
Epoch: 285/300  train loss: 0.5706357027724063
Epoch: 286/300  train loss: 0.5685308650366662
Epoch: 287/300  train loss: 0.5641438151845253
Epoch: 288/300  train loss: 0.5641026778302326
Epoch: 289/300  train loss: 0.5752063679621315
Epoch: 290/300  train loss: 0.5673458081649922
Epoch: 291/300  train loss: 0.5645986811110848
Epoch: 292/300  train loss: 0.5719952690343001
Epoch: 293/300  train loss: 0.5668923838219776
Epoch: 294/300  train loss: 0.5691729476761892
Epoch: 295/300  train loss: 0.5649229059278411
Epoch: 296/300  train loss: 0.5668271249477339
Epoch: 297/300  train loss: 0.5688653608593779
Epoch: 298/300  train loss: 0.5739670262986293
Epoch: 299/300  train loss: 0.5699526408323932
Epoch: 300/300  train loss: 0.5748210990392018
Traceback (most recent call last):
  File "main.py", line 83, in <module>
    print('Total program runtime: %s ms' % ((T2 - T1) * 1000))
NameError: name 'T1' is not defined
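The traceback shows the script crashed only after all 300 epochs finished: line 83 of main.py computes `(T2 - T1) * 1000`, but `T1` was never assigned before that point. A minimal sketch of the timing pattern the print statement apparently expects (assuming `T1`/`T2` were meant to be wall-clock timestamps bracketing the work; `time.sleep` stands in for the real training loop):

```python
import time

# Record the start timestamp BEFORE the work begins; this is the assignment
# that was missing in main.py and caused the NameError at the end.
T1 = time.time()

# ... the 300-epoch training loop would run here ...
time.sleep(0.01)  # stand-in for the real work

T2 = time.time()
elapsed_ms = (T2 - T1) * 1000
print('Total program runtime: %s ms' % elapsed_ms)
```

For benchmarking, `time.perf_counter()` is generally preferable to `time.time()` since it is monotonic and unaffected by system clock adjustments; either fixes the crash as long as the start timestamp is actually recorded.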