Commit b7db86a (parent dde3846)
doc/frameworks/pytorch/using_pytorch.rst
@@ -319,7 +319,7 @@ to run a distributed training job on two ``ml.p4d.24xlarge`` instances.
 .. note::

   For more information about setting up ``torchrun`` in your training script,
-  see `torchrun (Elastic Launch) <https://pytorch.org/docs/stable/elastic/run.html>_` in *the
+  see `torchrun (Elastic Launch) <https://pytorch.org/docs/stable/elastic/run.html>`_ in *the
   PyTorch documentation*.
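For context on why this one-character change matters: in reStructuredText, an inline external hyperlink is written with the trailing underscore *outside* the closing backtick (`` `text <url>`_ ``). The original line put the underscore inside the backticks (``>_` ``), which leaves the link unrendered. A minimal sketch of the correct form:

```rst
See `torchrun (Elastic Launch) <https://pytorch.org/docs/stable/elastic/run.html>`_
in the PyTorch documentation.
```

Tools such as ``rstcheck`` or a Sphinx build with ``-W`` will typically flag the malformed variant as an inline-markup or unreferenced-target warning.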
----