
Commit f738ea8

Change 'distribution' to 'distributions' in documentation
1 parent b096cd1 commit f738ea8

File tree: 3 files changed (+5, -5 lines)


src/sagemaker/mxnet/README.rst

Lines changed: 3 additions & 3 deletions
@@ -207,15 +207,15 @@ If you were previously relying on the default save method, you can now import on

     save(args.model_dir, model)

-Lastly, if you were relying on the container launching a parameter server for use with distributed training, you must now set ``distribution`` to the following dictionary when creating an MXNet estimator:
+Lastly, if you were relying on the container launching a parameter server for use with distributed training, you must now set ``distributions`` to the following dictionary when creating an MXNet estimator:

 .. code:: python

     from sagemaker.mxnet import MXNet

     estimator = MXNet('path-to-distributed-training-script.py',
                       ...,
-                      distribution={'parameter_server': {'enabled': True}})
+                      distributions={'parameter_server': {'enabled': True}})


 Using third-party libraries

@@ -321,7 +321,7 @@ The following are optional arguments. When you create an ``MXNet`` object, you c
   framework_version and py_version. Refer to: `SageMaker MXNet Docker Containers
   <#sagemaker-mxnet-docker-containers>`_ for details on what the Official images support
   and where to find the source code to build your custom image.
-- ``distribution`` For versions 1.3 and above only.
+- ``distributions`` For versions 1.3 and above only.
   Specifies information for how to run distributed training.
   To launch a parameter server during training, set this argument to:

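Taken together, the README change amounts to renaming one keyword argument. As a sanity check, here is a minimal sketch of a full estimator call after this commit; the entry point, role ARN, instance settings, and S3 path are hypothetical placeholders, not taken from the diff:

    from sagemaker.mxnet import MXNet

    # The only detail this commit changes is the keyword name:
    # ``distributions`` (plural) instead of ``distribution``.
    # Entry point, role, and instance settings below are hypothetical.
    estimator = MXNet(entry_point='train.py',
                      role='arn:aws:iam::123456789012:role/SageMakerRole',
                      train_instance_count=2,
                      train_instance_type='ml.p2.xlarge',
                      framework_version='1.3.0',
                      distributions={'parameter_server': {'enabled': True}})

    # Hypothetical S3 location for the training data.
    estimator.fit('s3://my-bucket/training-data')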
src/sagemaker/mxnet/estimator.py

Lines changed: 1 addition & 1 deletion
@@ -67,7 +67,7 @@ def __init__(self, entry_point, source_dir=None, hyperparameters=None, py_versio
                 Examples:
                     123.dkr.ecr.us-west-2.amazonaws.com/my-custom-image:1.0
                     custom-image:latest.
-            distribution (dict): A dictionary with information on how to run distributed training
+            distributions (dict): A dictionary with information on how to run distributed training
                 (default: None).
             **kwargs: Additional kwargs passed to the :class:`~sagemaker.estimator.Framework` constructor.
         """

src/sagemaker/tensorflow/estimator.py

Lines changed: 1 addition & 1 deletion
@@ -199,7 +199,7 @@ def __init__(self, training_steps=None, evaluation_steps=None, checkpoint_path=N
                 custom-image:latest.
             script_mode (bool): If set to True will the estimator will use the Script Mode containers (default: False).
                 This will be ignored if py_version is set to 'py3'.
-            distribution (dict): A dictionary with information on how to run distributed training
+            distributions (dict): A dictionary with information on how to run distributed training
                 (default: None). Currently we only support distributed training with parameter servers. To enable it
                 use the following setup:
                 {

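The TensorFlow docstring is cut off at the opening brace of the setup dictionary; judging from the MXNet hunks above, the payload is the same parameter-server dictionary. A hedged sketch of the corresponding call, where the entry point, role, and instance settings are hypothetical placeholders:

    from sagemaker.tensorflow import TensorFlow

    # Same rename on the TensorFlow side; everything except the
    # ``distributions`` argument below is a hypothetical placeholder.
    estimator = TensorFlow(entry_point='train.py',
                           role='arn:aws:iam::123456789012:role/SageMakerRole',
                           train_instance_count=2,
                           train_instance_type='ml.p2.xlarge',
                           script_mode=True,
                           distributions={'parameter_server': {'enabled': True}})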