# SageMaker Distributed Data Parallel - Release Notes

- First Release
- Getting Started

## First Release

SageMaker's distributed data parallel library extends SageMaker's training
capabilities on deep learning models with near-linear scaling efficiency,
achieving fast time-to-train with minimal code changes.

SageMaker Distributed Data Parallel:

- optimizes your training job for AWS network infrastructure and EC2 instance topology.
- takes advantage of gradient updates to communicate between nodes with a custom AllReduce algorithm.

The library currently supports TensorFlow v2 and PyTorch via [AWS Deep Learning Containers](https://aws.amazon.com/machine-learning/containers/).
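
The switch that turns the library on lives in the SageMaker Python SDK rather than in the training code itself. Below is a minimal sketch of launching a PyTorch training job with the library enabled; the `distribution` argument is the SDK's documented mechanism, while the entry point, role ARN, instance settings, and version strings are placeholders to replace with your own.

```python
# A minimal sketch, assuming a SageMaker Python SDK v2 setup. The entry
# point, IAM role, instance settings, and framework/Python versions are
# placeholders, not values from these release notes.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",          # hypothetical training script
    role="arn:aws:iam::111122223333:role/SageMakerRole",  # placeholder role
    instance_count=2,                # data parallelism spans instances
    instance_type="ml.p3.16xlarge",  # a multi-GPU EC2 instance type
    framework_version="1.6.0",       # placeholder framework version
    py_version="py36",
    # The documented switch that enables the distributed data parallel library:
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

estimator.fit("s3://your-bucket/your-training-data")  # placeholder S3 input
```
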
## Getting Started

To get started, refer to the [SageMaker Distributed Data Parallel Python SDK Guide](https://docs.aws.amazon.com/sagemaker/latest/dg/data-parallel-use-api.html#data-parallel-use-python-skd-api).
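
Inside the training script itself, the guide describes a small number of line-level changes. The sketch below illustrates them for PyTorch, assuming the import paths documented for this release; treat it as illustrative rather than as the guide's exact sample.

```python
# A hedged sketch of the PyTorch training-script changes described in the
# guide; the module paths below follow the library's documented import style
# at release and may differ in later versions.
import torch
import smdistributed.dataparallel.torch.distributed as dist
from smdistributed.dataparallel.torch.parallel.distributed import (
    DistributedDataParallel as DDP,
)

dist.init_process_group()  # initialize the library's process group

# Pin each worker process to the GPU matching its local rank.
torch.cuda.set_device(dist.get_local_rank())

model = torch.nn.Linear(10, 1).cuda()  # stand-in for your real model
# Wrapping the model routes gradient averaging through the custom AllReduce.
model = DDP(model)
```

From here, training proceeds as in standard PyTorch distributed data parallel, with each process working on its shard of the data.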