doc: add missing classes to API docs #1161

Merged
merged 9 commits on Dec 16, 2019

7 changes: 7 additions & 0 deletions doc/algorithm.rst
@@ -0,0 +1,7 @@
Algorithm Estimator
-------------------

.. automodule:: sagemaker.algorithm
:members:
:undoc-members:
:show-inheritance:
12 changes: 12 additions & 0 deletions doc/automl.rst
@@ -0,0 +1,12 @@
AutoML
------

.. automodule:: sagemaker.automl.automl
:members:
:undoc-members:
:show-inheritance:

.. automodule:: sagemaker.automl.candidate_estimator
:members:
:undoc-members:
:show-inheritance:
33 changes: 27 additions & 6 deletions doc/index.rst
@@ -16,19 +16,42 @@ Overview

overview

The SageMaker Python SDK consists of a few primary classes:
The SageMaker Python SDK consists of a variety of classes:

Training:

.. toctree::
:maxdepth: 2
:maxdepth: 1

estimators
algorithm
tuner
parameter
automl
processing
analytics

Inference:

.. toctree::
:maxdepth: 1

model
pipeline
multi_data_model
predictors
transformer
pipeline
model_monitor

Utility:

.. toctree::
:maxdepth: 1

session
analytics
inputs
network
s3

*****
MXNet
@@ -175,5 +198,3 @@ SageMaker APIs to export configurations for creating and managing Airflow workflows
:maxdepth: 2

sagemaker.workflow.airflow


7 changes: 7 additions & 0 deletions doc/inputs.rst
@@ -0,0 +1,7 @@
Inputs
------

.. automodule:: sagemaker.inputs
:members:
:undoc-members:
:show-inheritance:
27 changes: 27 additions & 0 deletions doc/model_monitor.rst
@@ -0,0 +1,27 @@
Model Monitor
-------------

.. automodule:: sagemaker.model_monitor.model_monitoring
:members:
:undoc-members:
:show-inheritance:

.. automodule:: sagemaker.model_monitor.monitoring_files
:members:
:undoc-members:
:show-inheritance:

.. automodule:: sagemaker.model_monitor.dataset_format
:members:
:undoc-members:
:show-inheritance:

.. automodule:: sagemaker.model_monitor.data_capture_config
:members:
:undoc-members:
:show-inheritance:

.. automodule:: sagemaker.model_monitor.cron_expression_generator
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions doc/multi_data_model.rst
@@ -0,0 +1,7 @@
MultiDataModel
--------------

.. automodule:: sagemaker.multidatamodel
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions doc/network.rst
@@ -0,0 +1,7 @@
Network Configuration
---------------------

.. automodule:: sagemaker.network
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions doc/parameter.rst
@@ -0,0 +1,7 @@
Parameters
----------

.. automodule:: sagemaker.parameter
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions doc/processing.rst
@@ -0,0 +1,7 @@
Processing
----------

.. automodule:: sagemaker.processing
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions doc/s3.rst
@@ -0,0 +1,7 @@
S3 Utilities
------------

.. automodule:: sagemaker.s3
:members:
:undoc-members:
:show-inheritance:
2 changes: 1 addition & 1 deletion doc/session.rst
@@ -4,4 +4,4 @@ Session
.. automodule:: sagemaker.session
:members:
:undoc-members:
:show-inheritance:
:show-inheritance:
6 changes: 3 additions & 3 deletions doc/using_tf.rst
@@ -93,7 +93,7 @@ For example, if you want to use a boolean hyperparameter, specify ``type`` as ``bool``

For a complete example of a TensorFlow training script, see `mnist.py <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/sagemaker-python-sdk/tensorflow_distributed_mnist/mnist.py>`__.
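
Hyperparameters passed to the estimator show up in the training script as command-line arguments, so the parsing side is typically plain ``argparse``. A minimal sketch (the hyperparameter names here are illustrative, not part of this change):

.. code:: python

    import argparse

    if __name__ == "__main__":
        parser = argparse.ArgumentParser()
        # Each hyperparameter passed to the estimator arrives as a --name argument.
        parser.add_argument("--epochs", type=int, default=10)
        parser.add_argument("--learning-rate", type=float, default=0.01)
        args, _ = parser.parse_known_args()
        print("training for {} epochs at learning rate {}".format(args.epochs, args.learning_rate))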


Adapting your local TensorFlow script
-------------------------------------

@@ -181,7 +181,7 @@ Required arguments

- ``str``: An S3 URI, for example ``s3://my-bucket/my-training-data``, which indicates the dataset's location.
- ``dict[str, str]``: A dictionary mapping channel names to S3 locations, for example ``{'train': 's3://my-bucket/my-training-data/train', 'test': 's3://my-bucket/my-training-data/test'}``
- ``sagemaker.session.s3_input``: channel configuration for S3 data sources that can provide additional information as well as the path to the training dataset. See `the API docs <https://sagemaker.readthedocs.io/en/stable/session.html#sagemaker.session.s3_input>`_ for full details.
- ``sagemaker.session.s3_input``: channel configuration for S3 data sources that can provide additional information as well as the path to the training dataset. See `the API docs <https://sagemaker.readthedocs.io/en/stable/inputs.html#sagemaker.inputs.s3_input>`_ for full details.
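
To make the three accepted forms concrete, here is a hedged sketch that assumes an already-constructed ``estimator`` and made-up bucket names:

.. code:: python

    from sagemaker.session import s3_input

    # 1. A single S3 URI.
    estimator.fit("s3://my-bucket/my-training-data")

    # 2. A dict mapping channel names to S3 locations.
    estimator.fit({
        "train": "s3://my-bucket/my-training-data/train",
        "test": "s3://my-bucket/my-training-data/test",
    })

    # 3. An s3_input channel configuration carrying extra options.
    train = s3_input("s3://my-bucket/my-training-data/train", content_type="text/csv")
    estimator.fit({"train": train})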

Optional arguments
------------------
@@ -593,7 +593,7 @@ The following content formats are supported without custom input and output handlers
For detailed information about how TensorFlow Serving formats these data types for input and output, see :ref:`using_tf:TensorFlow Serving Input and Output`.

You can also accept any custom data format by writing input and output functions, and include them in the ``inference.py`` file in your model.
For information, see :ref:`using_tf:Create Python Scripts for Custom Input and Output Formats`.
For information, see :ref:`using_tf:Create Python Scripts for Custom Input and Output Formats`.
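
A hedged sketch of such an ``inference.py``, following the handler interface this guide describes (the JSON handling shown is only an illustration):

.. code:: python

    import json

    def input_handler(data, context):
        """Pre-process the request before it is sent to TensorFlow Serving."""
        if context.request_content_type == "application/json":
            payload = json.loads(data.read().decode("utf-8"))
            # Wrap the payload in the "instances" structure TensorFlow Serving expects.
            return json.dumps({"instances": payload})
        raise ValueError("Unsupported content type: {}".format(context.request_content_type))

    def output_handler(response, context):
        """Post-process the TensorFlow Serving response before returning it to the client."""
        if response.status_code != 200:
            raise ValueError(response.content.decode("utf-8"))
        return response.content, context.accept_header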


TensorFlow Serving Input and Output
8 changes: 4 additions & 4 deletions src/sagemaker/multidatamodel.py
@@ -253,13 +253,13 @@ def deploy(
return None

def add_model(self, model_data_source, model_data_path=None):
"""Adds a model to the `MultiDataModel` by uploading or copying the model_data_source
"""Adds a model to the ``MultiDataModel`` by uploading or copying the model_data_source
artifact to the given S3 path model_data_path relative to model_data_prefix

Args:
model_source: Valid local file path or S3 path of the trained model artifact
model_data_path: S3 path where the trained model artifact
should be uploaded relative to `self.model_data_prefix` path. (default: None).
model_source: Valid local file path or S3 path of the trained model artifact
model_data_path: S3 path where the trained model artifact
should be uploaded relative to ``self.model_data_prefix`` path. (default: None).
If None, then the model artifact is uploaded to a path relative to model_data_prefix

Returns:
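
As context for the docstring being cleaned up above, a hedged usage sketch of ``MultiDataModel.add_model`` (the bucket names and the ``xgb_model`` object are assumptions, not part of this change):

.. code:: python

    from sagemaker.multidatamodel import MultiDataModel

    # Assumes `xgb_model` is an existing sagemaker.model.Model providing the container image.
    mdm = MultiDataModel(
        name="my-multi-model",
        model_data_prefix="s3://my-bucket/multi-model-artifacts/",
        model=xgb_model,
    )

    # Upload or copy an additional artifact under model_data_prefix.
    mdm.add_model(
        model_data_source="s3://my-bucket/training-output/model.tar.gz",
        model_data_path="model-2.tar.gz",
    )
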
2 changes: 1 addition & 1 deletion src/sagemaker/processing.py
@@ -686,7 +686,7 @@ def __init__(self, source, destination=None, output_name=None, s3_upload_mode="EndOfJob"):
source (str): The source for the output.
destination (str): The destination of the output. If a destination
is not provided, one will be generated:
"s3://<default-bucket-name>/<job-name>/output/<output-name>".
"s3://<default-bucket-name>/<job-name>/output/<output-name>".
output_name (str): The name of the output. If a name
is not provided, one will be generated (eg. "output-1").
s3_upload_mode (str): Valid options are "EndOfJob" or "Continuous".
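
A hedged usage sketch of ``ProcessingOutput`` matching the docstring above (paths and names are illustrative):

.. code:: python

    from sagemaker.processing import ProcessingOutput

    output = ProcessingOutput(
        source="/opt/ml/processing/output",          # path inside the processing container
        destination="s3://my-bucket/my-job/output",  # omit to get an auto-generated S3 URI
        output_name="train-features",
    )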