
Suppress `torch` AMP-CPU warnings (#6706)

This is a torch bug, but they seem unable or unwilling to fix it, so I'm creating a suppression in YOLOv5.

Resolves https://github.com/ultralytics/yolov5/issues/6692
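
For context, a minimal standalone sketch of the same suppression pattern is below. The `warnings.filterwarnings` call is copied from the diff; the autocast block is only a hypothetical repro, assuming (as the warning text and the linked issue suggest) that the message is emitted when torch's CUDA AMP autocast context is entered on a CPU-only machine:

```python
import warnings

import torch

# Same filter the commit adds in utils/torch_utils.py: `message` is a regex
# matched against the start of the warning text, so only this specific AMP/CPU
# warning is hidden and other PyTorch warnings still surface.
warnings.filterwarnings('ignore', message='User provided device_type of \'cuda\', but CUDA is not available. Disabling')

# Hypothetical repro on a CPU-only machine with an affected torch version:
# entering the CUDA autocast context (even with enabled=False) emitted the
# warning suppressed above.
with torch.cuda.amp.autocast(enabled=False):
    y = torch.ones(2) + 1
print(y)
```

Because the filter keys on the message text rather than blanket-ignoring `UserWarning`, it should be a low-risk workaround until an upstream torch fix lands.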
Glenn Jocher (GitHub) 2 years ago
Commit 4de8b24881
No GPG key could be found for this signature. GPG key ID: 4AEE18F83AFDEB23
1 changed file with 7 additions and 7 deletions

utils/torch_utils.py (+7, -7)

@@ -9,6 +9,7 @@ import os
 import platform
 import subprocess
 import time
+import warnings
 from contextlib import contextmanager
 from copy import deepcopy
 from pathlib import Path
@@ -25,6 +26,9 @@ try:
 except ImportError:
     thop = None
 
+# Suppress PyTorch warnings
+warnings.filterwarnings('ignore', message='User provided device_type of \'cuda\', but CUDA is not available. Disabling')
+
 
 @contextmanager
 def torch_distributed_zero_first(local_rank: int):
@@ -293,13 +297,9 @@ class EarlyStopping:
 
 
 class ModelEMA:
-    """ Model Exponential Moving Average from https://github.com/rwightman/pytorch-image-models
-    Keep a moving average of everything in the model state_dict (parameters and buffers).
-    This is intended to allow functionality like
-    https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage
-    A smoothed version of the weights is necessary for some training schemes to perform well.
-    This class is sensitive where it is initialized in the sequence of model init,
-    GPU assignment and distributed training wrappers.
+    """ Updated Exponential Moving Average (EMA) from https://github.com/rwightman/pytorch-image-models
+    Keeps a moving average of everything in the model state_dict (parameters and buffers)
+    For EMA details see https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage
     """
 
     def __init__(self, model, decay=0.9999, updates=0):
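
For reference, the exponential moving average the new docstring describes reduces to one in-place update per state_dict tensor. A minimal sketch follows; it is not the actual ModelEMA.update implementation, the helper name `ema_update` is illustrative, and the decay of 0.9999 is simply the default shown in the diff:

```python
import copy

import torch
import torch.nn as nn


def ema_update(ema_model: nn.Module, model: nn.Module, d: float = 0.9999) -> None:
    # v_ema <- d * v_ema + (1 - d) * v_model for every floating-point tensor in
    # the state_dict (parameters and buffers), as the docstring above describes.
    msd = model.state_dict()
    with torch.no_grad():
        for k, v in ema_model.state_dict().items():
            if v.dtype.is_floating_point:
                v.mul_(d).add_(msd[k].detach(), alpha=1 - d)


# Hypothetical usage: keep a smoothed copy of a model and update it each step.
model = nn.Linear(4, 2)
ema = copy.deepcopy(model).eval()
ema_update(ema, model)
```

With d close to 1 the averaged weights change slowly, which is what makes the EMA copy useful for evaluation while the raw weights are still being trained.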
