
Commit 6cfe54b

Author: Talia Chopra
documentation: updating version number.

Merge commit; 2 parents: 6328bb0 + f09d912

File tree

6 files changed: +41 additions, -11 deletions
doc/api/training/sdp_versions/v1_0_0.rst

Lines changed: 9 additions & 0 deletions

@@ -0,0 +1,9 @@
+
+Version 1.0.0 (Latest)
+======================
+
+.. toctree::
+   :maxdepth: 1
+
+   v1.0.0/smd_data_parallel_pytorch.rst
+   v1.0.0/smd_data_parallel_tensorflow.rst

doc/api/training/sdp_versions/v1_1_0.rst

Lines changed: 0 additions & 9 deletions
This file was deleted.

doc/api/training/smd_data_parallel.rst

Lines changed: 12 additions & 2 deletions
@@ -84,7 +84,7 @@ Select a version to see the API documentation for that version.
 .. toctree::
    :maxdepth: 1

-   sdp_versions/v1_1_0.rst
+   sdp_versions/v1_0_0.rst

 .. important::
    The distributed data parallel library only supports training jobs using CUDA 11. When you define a PyTorch or TensorFlow
@@ -93,4 +93,14 @@ Select a version to see the API documentation for that version.
    you must use a CUDA 11 base image. See
    `SageMaker Python SDK's distributed data parallel library APIs
    <https://docs.aws.amazon.com/sagemaker/latest/dg/data-parallel-use-api.html#data-parallel-use-python-skd-api>`_
-   for more information.
+   for more information.
+
+
+Release Notes
+=============
+
+New features, bug fixes, and improvements are regularly made to the SageMaker distributed data parallel library.
+
+To see the latest changes made to the library, refer to the library
+`Release Notes
+<https://github.com/aws/sagemaker-python-sdk/blob/master/doc/api/training/smd_data_parallel_release_notes/>`_.
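As context for the important note in the hunk above, the CUDA 11 requirement is satisfied by the Deep Learning Container that the estimator selects. A minimal sketch of enabling the library through the SageMaker Python SDK's distribution parameter follows; the entry point, IAM role, and framework/Python versions are illustrative placeholders, not values taken from this commit.

```python
# Minimal sketch: launching a training job with the SageMaker distributed
# data parallel library enabled. Versions shown are illustrative; any
# CUDA 11-based Deep Learning Container satisfies the note above.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",          # hypothetical training script
    role="arn:aws:iam::111122223333:role/SageMakerRole",  # placeholder IAM role
    instance_count=2,
    instance_type="ml.p3.16xlarge",  # a supported multi-GPU instance type
    framework_version="1.6.0",       # illustrative CUDA 11-based container
    py_version="py36",
    # Routes the job to the smdistributed dataparallel backend.
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)
estimator.fit()
```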
doc/api/training/smd_data_parallel_release_notes/ (new file)

Lines changed: 20 additions & 0 deletions

@@ -0,0 +1,20 @@
+# SageMaker Distributed Data Parallel 1.0.0 Release Notes
+
+- First Release
+- Getting Started
+
+## First Release
+
+SageMaker's distributed data parallel library extends SageMaker's training
+capabilities on deep learning models with near-linear scaling efficiency,
+achieving fast time-to-train with minimal code changes.
+SageMaker Distributed Data Parallel:
+
+- optimizes your training job for AWS network infrastructure and EC2 instance topology.
+- takes advantage of gradient updates to communicate between nodes with a custom AllReduce algorithm.
+
+The library currently supports TensorFlow v2 and PyTorch via [AWS Deep Learning Containers](https://aws.amazon.com/machine-learning/containers/).
+
+## Getting Started
+
+To get started, refer to the [SageMaker Distributed Data Parallel Python SDK Guide](https://docs.aws.amazon.com/sagemaker/latest/dg/data-parallel-use-api.html#data-parallel-use-python-skd-api).
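The custom AllReduce described under "First Release" surfaces in user code as a handful of changes to an ordinary training script. Below is a minimal sketch assuming the smdistributed.dataparallel PyTorch import paths documented for v1.0.0; the model, optimizer, and data are placeholders.

```python
# Minimal sketch of a PyTorch training script adapted for the library.
import torch
import torch.nn as nn
import smdistributed.dataparallel.torch.distributed as dist
from smdistributed.dataparallel.torch.parallel.distributed import (
    DistributedDataParallel as DDP,
)

dist.init_process_group()                     # initialize the library's communication backend
torch.cuda.set_device(dist.get_local_rank())  # pin each process to its own GPU

model = DDP(nn.Linear(10, 1).cuda())          # placeholder model; DDP averages gradients via AllReduce
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# One illustrative training step; the AllReduce runs during backward().
inputs, targets = torch.randn(32, 10).cuda(), torch.randn(32, 1).cuda()
optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
```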
