### PyTorch
#### Add support for PyTorch (PT) 1.7
- Adds support for the `gradient_as_bucket_view` (PyTorch 1.7 only), `find_unused_parameters` (PyTorch 1.7 only), and `broadcast_buffers` options to `smp.DistributedModel`. These options behave the same as the identically named options in the `torch.nn.parallel.DistributedDataParallel` API. Refer to the [SageMaker distributed model parallel API documentation](https://sagemaker.readthedocs.io/en/stable/api/training/smd_model_parallel_pytorch.html#smp.DistributedModel) for more information.
- Adds support for the `join` (PyTorch 1.7 only) context manager, used in conjunction with an instance of `torch.nn.parallel.DistributedDataParallel` to train with uneven input counts across participating processes.
- Adds support for `_register_comm_hook` (PyTorch 1.7 only), which registers a callable as a communication hook for DDP. NOTE: As in DDP, this is an experimental API and subject to change.
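The options above can be sketched as follows. This is a minimal illustration using plain PyTorch `DistributedDataParallel` (DDP), since the changelog states that `smp.DistributedModel` accepts the same options with the same semantics; in an actual SageMaker model parallel training script you would wrap the model with `smp.DistributedModel` instead. The single-process `gloo` process group here is purely for demonstration.

```python
# Sketch only: plain DDP stands in for smp.DistributedModel, which the
# changelog says accepts the same options with the same behavior.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process process group so the example runs standalone.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 2)
ddp_model = DDP(
    model,
    gradient_as_bucket_view=True,   # PyTorch 1.7+ only
    find_unused_parameters=False,   # PyTorch 1.7+ only
    broadcast_buffers=True,
)

# join() tolerates uneven numbers of input batches across ranks;
# with world_size=1 it is a no-op but shows the usage pattern.
with ddp_model.join():
    out = ddp_model(torch.randn(3, 4))
    out.sum().backward()

dist.destroy_process_group()
```

With more than one process, each rank would enter `join()` around its own training loop, and ranks that exhaust their data early shadow the collective operations of the ranks still training.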
0 commit comments