Commit 65307cd

Small style updates.
Parent: 6d32d0e

1 file changed: +5 −5 lines

doc/api/training/smd_model_parallel_release_notes/smd_model_parallel_change_log.md

```diff
@@ -8,14 +8,14 @@
 
 ### PyTorch
 
-#### Add support for PyTorch 1.7
+#### Add support for PyTorch (PT) 1.7
 
-- Adds support for `gradient_as_bucket_view` (PT 1.7 only), `find_unused_parameters` (PT 1.7 only) and `broadcast_buffers` options to `smp.DistributedModel`. These options behave the same as the corresponding options (with the same names) in
-`torch.DistributedDataParallel` API. Please refer to the [SMP API documentation](https://sagemaker.readthedocs.io/en/stable/api/training/smd_model_parallel_pytorch.html#smp.DistributedModel) for more information.
+- Adds support for `gradient_as_bucket_view` (PyTorch 1.7 only), `find_unused_parameters` (PyTorch 1.7 only) and `broadcast_buffers` options to `smp.DistributedModel`. These options behave the same as the corresponding options (with the same names) in
+`torch.DistributedDataParallel` API. Please refer to the [SageMaker distributed model parallel API documentation](https://sagemaker.readthedocs.io/en/stable/api/training/smd_model_parallel_pytorch.html#smp.DistributedModel) for more information.
 
-- Adds support for `join` (PT 1.7 only) context manager, which is to be used in conjunction with an instance of `torch.nn.parallel.DistributedDataParallel` to be able to train with uneven inputs across participating processes.
+- Adds support for `join` (PyTorch 1.7 only) context manager, which is to be used in conjunction with an instance of `torch.nn.parallel.DistributedDataParallel` to be able to train with uneven inputs across participating processes.
 
-- Adds support for `_register_comm_hook` (PT 1.7 only) which will register the callable as a communication hook for DDP. NOTE: Like in DDP, this is an experimental API and subject to change.
+- Adds support for `_register_comm_hook` (PyTorch 1.7 only) which will register the callable as a communication hook for DDP. NOTE: Like in DDP, this is an experimental API and subject to change.
 
 ### Tensorflow
 
```
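For context, the changelog entries in this diff state that the new `smp.DistributedModel` options behave the same as their `torch.nn.parallel.DistributedDataParallel` counterparts. A minimal single-process sketch of those DDP options and the `join` context manager, assuming PyTorch ≥ 1.7 with the gloo backend (this illustrates the upstream DDP API the entries reference, not `smp.DistributedModel` itself):

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process process group so DDP can be constructed locally.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 2)

# The three options named in the changelog, passed to DDP (PyTorch 1.7+).
ddp = DDP(
    model,
    gradient_as_bucket_view=True,   # gradients alias the bucket views
    find_unused_parameters=False,   # skip the unused-parameter search
    broadcast_buffers=True,         # sync buffers at forward time
)

# `join` context manager: tolerates uneven inputs across ranks.
with ddp.join():
    out = ddp(torch.randn(8, 4))
    out.sum().backward()

dist.destroy_process_group()
```

With a world size of 1 the `join` context is a no-op, but with multiple ranks it lets ranks that exhaust their data early shadow the collective communication of the ranks still training.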
