
Commit 3c99bfd

SeanNaren authored and Borda committed
[fix] Ensure we check deepspeed/sharded in multinode DDP (#6297)
* Ensure we check deepspeed/sharded in multinode
* Add CHANGELOG.md
* Drop mock, use actual multi-gpu node (see the test-gating sketch after the diffs below)
1 parent 8578ffa · commit 3c99bfd
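
The two files diffed on this page only cover the CHANGELOG entry and an import cleanup; the behavioural change named in the title is not visible here. As a rough, hypothetical sketch of the idea behind that title (the class, function, and member names below are assumptions for illustration, not the actual PyTorch Lightning source): when more than one node is requested, a configured DeepSpeed or sharded-DDP backend should be kept rather than silently replaced with plain DDP.

```python
from enum import Enum


class DistributedType(str, Enum):
    """Hypothetical subset of the distributed modes Lightning distinguishes."""
    DDP = "ddp"
    DDP_SHARDED = "ddp_sharded"
    DEEPSPEED = "deepspeed"


def resolve_multinode_backend(requested: DistributedType, num_nodes: int) -> DistributedType:
    """Sketch of the check: DeepSpeed/sharded backends must survive the
    multi-node branch instead of being overridden with plain DDP."""
    if num_nodes > 1 and requested in (DistributedType.DEEPSPEED, DistributedType.DDP_SHARDED):
        return requested
    if num_nodes > 1:
        # Multi-node runs without a special plugin still fall back to DDP.
        return DistributedType.DDP
    return requested


# The cases the fix cares about: a multi-node run keeps its DeepSpeed/sharded backend.
assert resolve_multinode_backend(DistributedType.DEEPSPEED, num_nodes=2) is DistributedType.DEEPSPEED
assert resolve_multinode_backend(DistributedType.DDP_SHARDED, num_nodes=2) is DistributedType.DDP_SHARDED
```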

File tree

2 files changed: +5 -6 lines changed


CHANGELOG.md

Lines changed: 4 additions & 0 deletions
@@ -32,6 +32,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Fixed `SingleTPU` calling `all_gather` ([#6296](https://github.com/PyTorchLightning/pytorch-lightning/pull/6296))
 
+
+- Ensure we check deepspeed/sharded in multinode DDP ([#6297](https://github.com/PyTorchLightning/pytorch-lightning/pull/6297))
+
+
 ## [1.2.2] - 2021-03-02
 
 ### Added

pytorch_lightning/profiler/profilers.py

Lines changed: 1 addition & 6 deletions
@@ -13,24 +13,19 @@
 # limitations under the License.
 """Profiler to check if there are any bottlenecks in your code."""
 import cProfile
-import inspect
 import io
 import os
 import pstats
 import time
 from abc import ABC, abstractmethod
 from collections import defaultdict
 from contextlib import contextmanager
-from typing import List, Optional, Union
+from typing import Optional, Union
 
 import numpy as np
-import torch
 
 from pytorch_lightning import _logger as log
-from pytorch_lightning.utilities import rank_zero_only
 from pytorch_lightning.utilities.cloud_io import get_filesystem
-from pytorch_lightning.utilities.distributed import rank_zero_warn
-from pytorch_lightning.utilities.exceptions import MisconfigurationException
 
 
 class BaseProfiler(ABC):
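
The last commit-message bullet, "Drop mock, use actual multi-gpu node", points at how the accompanying test is gated. A minimal, hypothetical sketch of that change in testing style, assuming pytest and torch are available (the test names and bodies are illustrative, not the tests from the PR):

```python
from unittest import mock

import pytest
import torch


def test_backend_selection_with_mocked_devices():
    # Mocked style: the device count is patched, so the real multi-GPU code path
    # is never exercised on the machine running the test.
    with mock.patch("torch.cuda.device_count", return_value=2):
        assert torch.cuda.device_count() == 2


@pytest.mark.skipif(torch.cuda.device_count() < 2, reason="requires a multi-GPU node")
def test_backend_selection_on_real_gpus():
    # Gated style: the test only runs on a machine that actually has two or more
    # GPUs, so the distributed setup under test is real rather than simulated.
    assert torch.cuda.device_count() >= 2
```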
