Commit b0d1996

edenlightning authored

Update gpu warning (#6181)

Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: Kaushik Bokka <[email protected]>
1 parent c7130b7 commit b0d1996

File tree

1 file changed: +4 additions, -1 deletion

pytorch_lightning/trainer/connectors/accelerator_connector.py

Lines changed: 4 additions & 1 deletion
@@ -547,7 +547,10 @@ def set_distributed_mode(self, distributed_backend: Optional[str] = None):
         rank_zero_info(f'TPU available: {_TPU_AVAILABLE}, using: {num_cores} TPU cores')

         if torch.cuda.is_available() and self._device_type != DeviceType.GPU:
-            rank_zero_warn("GPU available but not used. Set the --gpus flag when calling the script.")
+            rank_zero_warn(
+                "GPU available but not used. Set the gpus flag in your trainer"
+                " `Trainer(gpus=1)` or script `--gpus=1`."
+            )

     def _set_horovod_backend(self):
         self.check_horovod()
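The new warning call spreads its message over two string literals; Python joins adjacent string literals into one string at compile time, so the reader sees a single message. A minimal sketch of how the patched message assembles, using the standard `warnings` module as a stand-in (Lightning's actual `rank_zero_warn` additionally suppresses the warning on non-zero ranks):

```python
import warnings

# Adjacent string literals concatenate implicitly in Python, so the
# two pieces below form one message, exactly as in the patched call.
message = (
    "GPU available but not used. Set the gpus flag in your trainer"
    " `Trainer(gpus=1)` or script `--gpus=1`."
)

# Stand-in for rank_zero_warn: emit the warning through the stdlib.
warnings.warn(message)
```

Note the leading space on the second literal: without it the two fragments would run together as "trainer`Trainer(gpus=1)`".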
