Commit 6a14146

Custom Plugin is_distributed (#6537)
* return from plugin
* don't return for TPU
1 parent 6453091 commit 6a14146

File tree

1 file changed: +4 -0 lines changed


pytorch_lightning/trainer/connectors/accelerator_connector.py

Lines changed: 4 additions & 0 deletions
@@ -273,6 +273,10 @@ def use_deepspeed(self) -> bool:

     @property
     def is_distributed(self) -> bool:
+        # Used for custom plugins.
+        # Custom plugins should implement is_distributed property.
+        if hasattr(self.training_type_plugin, 'is_distributed') and not self.on_tpu:
+            return self.training_type_plugin.is_distributed
         is_distributed = self.use_ddp or self.use_ddp2 or self.use_horovod
         if self.on_tpu:
             is_distributed |= self.training_type_plugin.is_distributed
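
A minimal sketch (not part of this commit) of how a custom plugin could use this change: any training type plugin that defines an `is_distributed` property now has that value returned directly by the accelerator connector, except when running on TPU. The class name `MyDistributedPlugin` and the choice of `SingleDevicePlugin` as a base class are assumptions for illustration only.

    # Hypothetical sketch, not from the PyTorch Lightning codebase.
    # Assumes SingleDevicePlugin is importable from pytorch_lightning.plugins
    # at this version; the subclass name is illustrative only.
    from pytorch_lightning.plugins import SingleDevicePlugin


    class MyDistributedPlugin(SingleDevicePlugin):
        """Custom training type plugin that reports itself as distributed."""

        @property
        def is_distributed(self) -> bool:
            # With this commit, the accelerator connector returns this value
            # directly (unless running on TPU), so the Trainer treats runs
            # with this plugin as distributed, e.g. when deciding whether to
            # use distributed samplers.
            return True

Passing such a plugin to the Trainer (e.g. via the `plugins` argument) would then flip `AcceleratorConnector.is_distributed` without requiring DDP, DDP2, or Horovod to be active; the exact Trainer wiring is an assumption here, not shown in this diff.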

0 commit comments
