
Commit 0e8f4a8

amogkam authored and lexierule committed

Custom Plugin is_distributed (#6537)

* return from plugin
* dont return for tpu

(cherry picked from commit 6a14146)

1 parent: c8fb646

File tree

1 file changed: +4, -0 lines


pytorch_lightning/trainer/connectors/accelerator_connector.py

Lines changed: 4 additions & 0 deletions
@@ -266,6 +266,10 @@ def use_deepspeed(self) -> bool:
 
     @property
     def is_distributed(self) -> bool:
+        # Used for custom plugins.
+        # Custom plugins should implement is_distributed property.
+        if hasattr(self.training_type_plugin, 'is_distributed') and not self.on_tpu:
+            return self.training_type_plugin.is_distributed
         is_distributed = self.use_ddp or self.use_ddp2 or self.use_horovod
         if self.on_tpu:
             is_distributed |= self.training_type_plugin.is_distributed
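
For context: with this change, a user-supplied training type plugin can answer is_distributed itself, and the accelerator connector defers to it (except on TPU, which keeps its own handling). Below is a minimal sketch, not part of the commit, of how a custom plugin might opt in; SingleDevicePlugin, Trainer(plugins=...), and trainer.accelerator_connector are assumptions based on the PyTorch Lightning 1.2-era plugin API.

# Minimal sketch, assuming the PyTorch Lightning 1.2-era plugin API.
# MyDistributedPlugin is a hypothetical example, not from the commit.
import torch
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import SingleDevicePlugin


class MyDistributedPlugin(SingleDevicePlugin):
    @property
    def is_distributed(self) -> bool:
        # The new hasattr branch in accelerator_connector.py returns
        # this value directly whenever the plugin defines the property.
        return True


trainer = Trainer(plugins=[MyDistributedPlugin(device=torch.device("cpu"))])
assert trainer.accelerator_connector.is_distributed  # picked up from the plugin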
