Commit 33fcb1d

Update doc/api/training/smd_data_parallel_release_notes/smd_data_parallel_change_log.md
1 parent be71c94 commit 33fcb1d

File tree

1 file changed: +1 addition, −1 deletion


doc/api/training/smd_data_parallel_release_notes/smd_data_parallel_change_log.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,7 +7,7 @@
 SageMaker's distributed data parallel library extends SageMaker's training
 capabilities on deep learning models with near-linear scaling efficiency,
 achieving fast time-to-train with minimal code changes.
-SageMaker Distributed Data Parallel :
+SageMaker Distributed Data Parallel:

 - optimizes your training job for AWS network infrastructure and EC2 instance topology.
 - takes advantage of gradient update to communicate between nodes with a custom AllReduce algorithm.
```
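The second bullet in the diff mentions a custom AllReduce algorithm for gradient exchange. As an illustration of what an AllReduce computes, here is a minimal pure-Python ring-allreduce sketch: every worker starts with its own gradient vector and ends with the element-wise sum across all workers. This is only a conceptual sketch with hypothetical names, not the library's actual (network-optimized) implementation.

```python
def ring_allreduce(grads):
    """Simulate a ring all-reduce over n workers.

    grads: list of n per-worker gradient vectors, each of length n
    (one scalar "chunk" per worker, for clarity).
    Returns the post-allreduce state of every worker: each one holds
    the element-wise sum of all input vectors.
    """
    n = len(grads)
    state = [list(g) for g in grads]

    # Phase 1: reduce-scatter. In n-1 steps, worker r forwards chunk
    # (r - step) % n to its right neighbor, which accumulates it.
    # Afterwards worker r owns the fully summed chunk (r + 1) % n.
    for step in range(n - 1):
        sends = [(r, (r - step) % n, state[r][(r - step) % n])
                 for r in range(n)]          # snapshot before applying
        for r, idx, val in sends:
            state[(r + 1) % n][idx] += val

    # Phase 2: all-gather. Each fully reduced chunk circulates the
    # ring, overwriting stale partial sums on the other workers.
    for step in range(n - 1):
        sends = [(r, (r + 1 - step) % n, state[r][(r + 1 - step) % n])
                 for r in range(n)]
        for r, idx, val in sends:
            state[(r + 1) % n][idx] = val

    return state

workers = [[1.0, 2.0, 3.0],
           [4.0, 5.0, 6.0],
           [7.0, 8.0, 9.0]]
result = ring_allreduce(workers)
# every worker now holds the element-wise sum [12.0, 15.0, 18.0]
```

The ring structure is what makes this communication pattern bandwidth-efficient: each worker only ever talks to one neighbor, and each gradient chunk crosses every link exactly twice.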

0 commit comments
