
Fixed logging level in distributed mode (#4284)

Co-authored-by: fkwong <huangfuqiang@transai.cn>
Branch: modifyDataloader
imyhxy (GitHub), 3 years ago
parent
commit 771ac6c53d
No known key found for this signature in database. GPG key ID: 4AEE18F83AFDEB23
1 changed file with 1 addition and 1 deletion
utils/torch_utils.py (+1, -1)

@@ -23,7 +23,6 @@ try:
 except ImportError:
     thop = None
 
-logging.basicConfig(format="%(message)s", level=logging.INFO)
 LOGGER = logging.getLogger(__name__)

@@ -108,6 +107,7 @@ def profile(input, ops, n=10, device=None):
     # profile(input, [m1, m2], n=100)  # profile over 100 iterations
 
     results = []
+    logging.basicConfig(format="%(message)s", level=logging.INFO)
     device = device or select_device()
     print(f"{'Params':>12s}{'GFLOPs':>12s}{'GPU_mem (GB)':>14s}{'forward (ms)':>14s}{'backward (ms)':>14s}"
           f"{'input':>24s}{'output':>24s}")
