1 parent be71c94 commit 33fcb1d
doc/api/training/smd_data_parallel_release_notes/smd_data_parallel_change_log.md
@@ -7,7 +7,7 @@
 SageMaker's distributed data parallel library extends SageMaker’s training
 capabilities on deep learning models with near-linear scaling efficiency,
 achieving fast time-to-train with minimal code changes.
-SageMaker Distributed Data Parallel :
+SageMaker Distributed Data Parallel:
 
 - optimizes your training job for AWS network infrastructure and EC2 instance topology.
 - takes advantage of gradient update to communicate between nodes with a custom AllReduce algorithm.
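
For context, the "minimal code changes" this changelog refers to typically amount to swapping PyTorch's native process-group setup and DDP wrapper for the library's equivalents. The sketch below is not part of this commit; it assumes the smdistributed.dataparallel PyTorch v1.x interface and a SageMaker training job launched with the library enabled, and the nn.Linear model is a placeholder.

import torch
import torch.nn as nn
import smdistributed.dataparallel.torch.distributed as dist
from smdistributed.dataparallel.torch.parallel.distributed import (
    DistributedDataParallel as DDP,
)

dist.init_process_group()           # join the library's process group
local_rank = dist.get_local_rank()  # GPU index on this instance
torch.cuda.set_device(local_rank)

# Placeholder model; wrapping it in the library's DDP routes gradient
# updates through the custom AllReduce algorithm described above.
model = DDP(nn.Linear(10, 1).to(local_rank))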