
Commit 197c31b

Merge branch 'master' into more-docstyle-improvements
2 parents: e968282 + 23d2b7f


72 files changed: +1251 additions, −3437 deletions

CHANGELOG.md

Lines changed: 69 additions & 0 deletions
@@ -1,5 +1,74 @@
 # Changelog
 
+## v2.14.0 (2020-10-05)
+
+### Features
+
+* upgrade Neo MxNet to 1.7
+
+### Bug Fixes and Other Changes
+
+* add a condition to retrieve correct image URI for xgboost
+
+## v2.13.0 (2020-09-30)
+
+### Features
+
+* add xgboost framework version 1.2-1
+
+### Bug Fixes and Other Changes
+
+* revert "feature: upgrade Neo MxNet to 1.7 (#1928)"
+
+## v2.12.0 (2020-09-29)
+
+### Features
+
+* upgrade Neo MxNet to 1.7
+
+## v2.11.0 (2020-09-28)
+
+### Features
+
+* Add SDK support for SparkML Serving Container version 2.4
+
+### Bug Fixes and Other Changes
+
+* pin pytest version <6.1.0 to avoid pytest-rerunfailures breaking changes
+* temporarily skip the MxNet Neo test until we fix them
+
+### Documentation Changes
+
+* fix conda setup for docs
+
+## v2.10.0 (2020-09-23)
+
+### Features
+
+* add inferentia pytorch inference container config
+
+## v2.9.2 (2020-09-21)
+
+### Bug Fixes and Other Changes
+
+* allow kms encryption upload for processing
+
+## v2.9.1 (2020-09-17)
+
+### Bug Fixes and Other Changes
+
+* update spark image_uri config with eu-north-1 account
+
+## v2.9.0 (2020-09-17)
+
+### Features
+
+* add MXNet 1.7.0 images
+
+### Documentation Changes
+
+* removed Kubernetes workflow content
+
 ## v2.8.0 (2020-09-16)
 
 ### Features
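
The xgboost entries above (framework version 1.2-1 added in v2.13.0, plus the image-URI condition fixed in v2.14.0) surface through the SDK's ``image_uris`` module. A minimal sketch of retrieving the new container image, where the region value is illustrative:

from sagemaker import image_uris

# Retrieve the ECR image URI for the XGBoost 1.2-1 framework container
# added in v2.13.0. The region and version values here are illustrative.
uri = image_uris.retrieve(framework="xgboost", region="us-west-2", version="1.2-1")
print(uri)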

README.rst

Lines changed: 2 additions & 3 deletions
@@ -53,7 +53,6 @@ Table of Contents
 #. `Secure Training and Inference with VPC <https://sagemaker.readthedocs.io/en/stable/overview.html#secure-training-and-inference-with-vpc>`__
 #. `BYO Model <https://sagemaker.readthedocs.io/en/stable/overview.html#byo-model>`__
 #. `Inference Pipelines <https://sagemaker.readthedocs.io/en/stable/overview.html#inference-pipelines>`__
-#. `Amazon SageMaker Operators for Kubernetes <https://sagemaker.readthedocs.io/en/stable/amazon_sagemaker_operators_for_kubernetes.html>`__
 #. `Amazon SageMaker Operators in Apache Airflow <https://sagemaker.readthedocs.io/en/stable/using_workflow.html>`__
 #. `SageMaker Autopilot <src/sagemaker/automl/README.rst>`__
 #. `Model Monitoring <https://sagemaker.readthedocs.io/en/stable/amazon_sagemaker_model_monitoring.html>`__
@@ -165,7 +164,7 @@ Setup a Python environment, and install the dependencies listed in ``doc/requirements.txt``:
     # conda
     conda create -n sagemaker python=3.7
     conda activate sagemaker
-    conda install --file doc/requirements.txt
+    conda install sphinx=3.1.1 sphinx_rtd_theme=0.5.0
 
     # pip
     pip install -r doc/requirements.txt
@@ -202,7 +201,7 @@ In order to host a SparkML model in SageMaker, it should be serialized with ``MLeap`` library.
 
 For more information on MLeap, see https://github.com/combust/mleap .
 
-Supported major version of Spark: 2.2 (MLeap version - 0.9.6)
+Supported major version of Spark: 2.4 (MLeap version - 0.9.6)
 
 Here is an example on how to create an instance of ``SparkMLModel`` class and use ``deploy()`` method to create an
 endpoint which can be used to perform prediction against your trained SparkML Model.
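
The last hunk bumps the supported Spark major version for SparkML Serving to 2.4, matching the v2.11.0 changelog entry. A minimal sketch of the ``SparkMLModel`` + ``deploy()`` flow the README describes, where the S3 path, role ARN, and instance settings are placeholders, not values from this commit:

from sagemaker.sparkml.model import SparkMLModel

# model_data must point at an MLeap-serialized Spark 2.4 model;
# the bucket, role ARN, and instance type below are placeholders.
sparkml_model = SparkMLModel(
    model_data="s3://my-bucket/sparkml/model.tar.gz",
    role="arn:aws:iam::111122223333:role/SageMakerRole",
)
predictor = sparkml_model.deploy(
    initial_instance_count=1,
    instance_type="ml.c4.xlarge",
)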

VERSION

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-2.8.1.dev0
+2.14.1.dev0

doc/amazon_sagemaker_processing.rst

Lines changed: 5 additions & 5 deletions
@@ -69,15 +69,15 @@ For an in-depth look, please see the `Scikit-learn Data Processing and Model Evaluation` example notebook.
 
 Data Processing with Spark
 ============================================
-SageMaker provides two classes for customers to run Spark applications: :class:`sagemaker.processing.PySparkProcessor` and :class:`sagemaker.processing.SparkJarProcessor`
+SageMaker provides two classes for customers to run Spark applications: :class:`sagemaker.spark.processing.PySparkProcessor` and :class:`sagemaker.spark.processing.SparkJarProcessor`
 
 
 PySparkProcessor
 ---------------------
 
-You can use the :class:`sagemaker.processing.PySparkProcessor` class to run PySpark scripts as processing jobs.
+You can use the :class:`sagemaker.spark.processing.PySparkProcessor` class to run PySpark scripts as processing jobs.
 
-This example shows how you can take an existing PySpark script and run a processing job with the :class:`sagemaker.processing.PySparkProcessor` class and the pre-built SageMaker Spark container.
+This example shows how you can take an existing PySpark script and run a processing job with the :class:`sagemaker.spark.processing.PySparkProcessor` class and the pre-built SageMaker Spark container.
 
 First you need to create a :class:`PySparkProcessor` object
 
@@ -230,8 +230,8 @@ Processing class documentation
 - :class:`sagemaker.processing.Processor`
 - :class:`sagemaker.processing.ScriptProcessor`
 - :class:`sagemaker.sklearn.processing.SKLearnProcessor`
-- :class:`sagemaker.sklearn.processing.PySparkProcessor`
-- :class:`sagemaker.sklearn.processing.SparkJarProcessor`
+- :class:`sagemaker.spark.processing.PySparkProcessor`
+- :class:`sagemaker.spark.processing.SparkJarProcessor`
 - :class:`sagemaker.processing.ProcessingInput`
 - :class:`sagemaker.processing.ProcessingOutput`
 - :class:`sagemaker.processing.ProcessingJob`
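
Since the module path is exactly what this doc fix corrects, a minimal usage sketch with the fixed import; the job name, role ARN, script path, and instance settings are illustrative:

from sagemaker.spark.processing import PySparkProcessor

# The class lives in sagemaker.spark.processing, not sagemaker.processing
# (the path corrected above). All argument values are placeholders.
spark_processor = PySparkProcessor(
    base_job_name="sm-spark",
    framework_version="2.4",
    role="arn:aws:iam::111122223333:role/SageMakerRole",
    instance_count=2,
    instance_type="ml.c5.xlarge",
    max_runtime_in_seconds=1200,
)
spark_processor.run(submit_app="./preprocess.py")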

doc/api/training/processing.rst

Lines changed: 5 additions & 0 deletions
@@ -5,3 +5,8 @@ Processing
     :members:
     :undoc-members:
     :show-inheritance:
+
+.. automodule:: sagemaker.spark.processing
+    :members:
+    :undoc-members:
+    :show-inheritance:

doc/overview.rst

Lines changed: 1 addition & 1 deletion
@@ -672,7 +672,7 @@ You can install all necessary for this feature dependencies using pip:
 For more detailed examples of running hyperparameter tuning jobs, see:
 
 - `Using the TensorFlow estimator with hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/tensorflow_mnist/hpo_tensorflow_mnist.ipynb>`__
-- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/hpo_r_bring_your_own.ipynb>`__
+- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/tune_r_bring_your_own.ipynb>`__
 - `Analyzing results <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/analyze_results/HPO_Analyze_TuningJob_Results.ipynb>`__
 
 You can also find these notebooks in the **Hyperprameter Tuning** section of the **SageMaker Examples** section in a notebook instance.
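
For orientation, the SDK entry point behind those notebooks is ``sagemaker.tuner.HyperparameterTuner``. A minimal sketch, assuming ``estimator`` is an already-configured SageMaker estimator; the metric name, regex, and range are illustrative:

from sagemaker.tuner import ContinuousParameter, HyperparameterTuner

# `estimator` is assumed to be an already-configured estimator object;
# the objective metric, regex, and hyperparameter range are illustrative.
tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:accuracy",
    hyperparameter_ranges={"learning_rate": ContinuousParameter(0.01, 0.2)},
    metric_definitions=[{"Name": "validation:accuracy",
                         "Regex": "accuracy = ([0-9\\.]+)"}],
    max_jobs=9,
    max_parallel_jobs=3,
)
tuner.fit({"train": "s3://my-bucket/train"})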

doc/v2.rst

Lines changed: 1 addition & 1 deletion
@@ -231,7 +231,7 @@ The following estimator parameters have been renamed:
 +------------------------------+------------------------+
 | ``train_use_spot_instances`` | ``use_spot_instances`` |
 +------------------------------+------------------------+
-| ``train_max_run_wait``       | ``max_wait``           |
+| ``train_max_wait``           | ``max_wait``           |
 +------------------------------+------------------------+
 | ``train_volume_size``        | ``volume_size``        |
 +------------------------------+------------------------+
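
The corrected row documents the v2 rename. A minimal sketch of the renamed spot-instance parameters on a generic ``Estimator``, where the image URI, role ARN, and limits are placeholders:

from sagemaker.estimator import Estimator

# v2 parameter names: use_spot_instances / max_run / max_wait
# (v1 used the train_* prefix, e.g. train_max_wait).
# The image URI and role ARN below are placeholders.
estimator = Estimator(
    image_uri="123456789012.dkr.ecr.us-west-2.amazonaws.com/my-image:latest",
    role="arn:aws:iam::111122223333:role/SageMakerRole",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    use_spot_instances=True,
    max_run=3600,   # max training time in seconds
    max_wait=7200,  # must be >= max_run when using spot instances
)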

doc/workflows/index.rst

Lines changed: 0 additions & 1 deletion
@@ -8,5 +8,4 @@ The SageMaker Python SDK supports managed training and inference for a variety of machine learning frameworks:
    :maxdepth: 2
 
    airflow/index
-   kubernetes/index
    step_functions/index

doc/workflows/kubernetes/amazon_sagemaker_components_for_kubeflow_pipelines.rst

Lines changed: 0 additions & 209 deletions
This file was deleted.
