Commit 0b5032c

Merge branch 'master' into network-config-keyword
2 parents af618c1 + 23d2b7f


56 files changed (+1020, -138 lines)

CHANGELOG.md

Lines changed: 41 additions & 0 deletions
@@ -1,5 +1,46 @@
 # Changelog
 
+## v2.14.0 (2020-10-05)
+
+### Features
+
+* upgrade Neo MxNet to 1.7
+
+### Bug Fixes and Other Changes
+
+* add a condition to retrieve correct image URI for xgboost
+
+## v2.13.0 (2020-09-30)
+
+### Features
+
+* add xgboost framework version 1.2-1
+
+### Bug Fixes and Other Changes
+
+* revert "feature: upgrade Neo MxNet to 1.7 (#1928)"
+
+## v2.12.0 (2020-09-29)
+
+### Features
+
+* upgrade Neo MxNet to 1.7
+
+## v2.11.0 (2020-09-28)
+
+### Features
+
+* Add SDK support for SparkML Serving Container version 2.4
+
+### Bug Fixes and Other Changes
+
+* pin pytest version <6.1.0 to avoid pytest-rerunfailures breaking changes
+* temporarily skip the MxNet Neo test until we fix them
+
+### Documentation Changes
+
+* fix conda setup for docs
+
 ## v2.10.0 (2020-09-23)
 
 ### Features
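The "correct image URI for xgboost" fix noted above turns on the version string: legacy algorithm-style versions and the newer framework-style versions (such as the 1.2-1 release also listed here) live in different ECR repositories. A minimal sketch of that kind of condition; the repository names and the exact version sets are assumptions for illustration, not taken from the SDK source:

```python
# Illustrative only: legacy XGBoost versions resolve to the built-in
# algorithm registry, framework-style versions (e.g. "1.0-1", "1.2-1")
# resolve to the "sagemaker-xgboost" framework repository.
LEGACY_XGBOOST_VERSIONS = {"1", "latest"}


def xgboost_repository(version):
    """Pick an ECR repository name for a given XGBoost version string."""
    if version in LEGACY_XGBOOST_VERSIONS:
        return "xgboost"  # built-in algorithm image
    return "sagemaker-xgboost"  # framework image


print(xgboost_repository("1.2-1"))  # sagemaker-xgboost
print(xgboost_repository("latest"))  # xgboost
```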

README.rst

Lines changed: 2 additions & 2 deletions
@@ -164,7 +164,7 @@ Setup a Python environment, and install the dependencies listed in ``doc/require
 # conda
 conda create -n sagemaker python=3.7
 conda activate sagemaker
-conda install --file doc/requirements.txt
+conda install sphinx=3.1.1 sphinx_rtd_theme=0.5.0
 
 # pip
 pip install -r doc/requirements.txt
@@ -201,7 +201,7 @@ In order to host a SparkML model in SageMaker, it should be serialized with ``ML
 
 For more information on MLeap, see https://github.com/combust/mleap .
 
-Supported major version of Spark: 2.2 (MLeap version - 0.9.6)
+Supported major version of Spark: 2.4 (MLeap version - 0.9.6)
 
 Here is an example on how to create an instance of ``SparkMLModel`` class and use ``deploy()`` method to create an
 endpoint which can be used to perform prediction against your trained SparkML Model.

VERSION

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-2.10.1.dev0
+2.14.1.dev0

doc/amazon_sagemaker_processing.rst

Lines changed: 5 additions & 5 deletions
@@ -69,15 +69,15 @@ For an in-depth look, please see the `Scikit-learn Data Processing and Model Eva
 
 Data Processing with Spark
 ============================================
-SageMaker provides two classes for customers to run Spark applications: :class:`sagemaker.processing.PySparkProcessor` and :class:`sagemaker.processing.SparkJarProcessor`
+SageMaker provides two classes for customers to run Spark applications: :class:`sagemaker.spark.processing.PySparkProcessor` and :class:`sagemaker.spark.processing.SparkJarProcessor`
 
 
 PySparkProcessor
 ---------------------
 
-You can use the :class:`sagemaker.processing.PySparkProcessor` class to run PySpark scripts as processing jobs.
+You can use the :class:`sagemaker.spark.processing.PySparkProcessor` class to run PySpark scripts as processing jobs.
 
-This example shows how you can take an existing PySpark script and run a processing job with the :class:`sagemaker.processing.PySparkProcessor` class and the pre-built SageMaker Spark container.
+This example shows how you can take an existing PySpark script and run a processing job with the :class:`sagemaker.spark.processing.PySparkProcessor` class and the pre-built SageMaker Spark container.
 
 First you need to create a :class:`PySparkProcessor` object
@@ -230,8 +230,8 @@ Processing class documentation
 - :class:`sagemaker.processing.Processor`
 - :class:`sagemaker.processing.ScriptProcessor`
 - :class:`sagemaker.sklearn.processing.SKLearnProcessor`
-- :class:`sagemaker.sklearn.processing.PySparkProcessor`
-- :class:`sagemaker.sklearn.processing.SparkJarProcessor`
+- :class:`sagemaker.spark.processing.PySparkProcessor`
+- :class:`sagemaker.spark.processing.SparkJarProcessor`
 - :class:`sagemaker.processing.ProcessingInput`
 - :class:`sagemaker.processing.ProcessingOutput`
 - :class:`sagemaker.processing.ProcessingJob`

doc/api/training/processing.rst

Lines changed: 5 additions & 0 deletions
@@ -5,3 +5,8 @@ Processing
     :members:
     :undoc-members:
     :show-inheritance:
+
+.. automodule:: sagemaker.spark.processing
+    :members:
+    :undoc-members:
+    :show-inheritance:

doc/overview.rst

Lines changed: 1 addition & 1 deletion
@@ -672,7 +672,7 @@ You can install all necessary for this feature dependencies using pip:
 For more detailed examples of running hyperparameter tuning jobs, see:
 
 - `Using the TensorFlow estimator with hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/tensorflow_mnist/hpo_tensorflow_mnist.ipynb>`__
-- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/hpo_r_bring_your_own.ipynb>`__
+- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/tune_r_bring_your_own.ipynb>`__
 - `Analyzing results <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/analyze_results/HPO_Analyze_TuningJob_Results.ipynb>`__
 
 You can also find these notebooks in the **Hyperprameter Tuning** section of the **SageMaker Examples** section in a notebook instance.

doc/v2.rst

Lines changed: 1 addition & 1 deletion
@@ -231,7 +231,7 @@ The following estimator parameters have been renamed:
 +------------------------------+------------------------+
 | ``train_use_spot_instances`` | ``use_spot_instances`` |
 +------------------------------+------------------------+
-| ``train_max_run_wait``       | ``max_wait``           |
+| ``train_max_wait``           | ``max_wait``           |
 +------------------------------+------------------------+
 | ``train_volume_size``        | ``volume_size``        |
 +------------------------------+------------------------+

setup.py

Lines changed: 1 addition & 1 deletion
@@ -61,7 +61,7 @@ def read_version():
         extras["all"],
         "tox",
         "flake8",
-        "pytest",
+        "pytest<6.1.0",
         "pytest-cov",
         "pytest-rerunfailures",
         "pytest-xdist",

src/sagemaker/algorithm.py

Lines changed: 4 additions & 0 deletions
@@ -17,6 +17,7 @@
 import sagemaker.parameter
 from sagemaker import vpc_utils
 from sagemaker.deserializers import BytesDeserializer
+from sagemaker.deprecations import removed_kwargs
 from sagemaker.estimator import EstimatorBase
 from sagemaker.serializers import IdentitySerializer
 from sagemaker.transformer import Transformer
@@ -291,6 +292,9 @@ def create_model(
         Returns:
             a Model ready for deployment.
         """
+        removed_kwargs("content_type", kwargs)
+        removed_kwargs("accept", kwargs)
+
         if predictor_cls is None:
 
             def predict_wrapper(endpoint, session):
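The new ``removed_kwargs`` calls guard ``create_model`` against v1-era arguments that no longer do anything. The diff does not show the helper itself; here is a minimal sketch of what such a helper plausibly does (the real ``sagemaker.deprecations`` implementation may differ, e.g. in whether it pops the argument):

```python
import warnings


def removed_kwargs(name, kwargs):
    """Sketch: drop a removed keyword argument and warn instead of failing."""
    if name in kwargs:
        kwargs.pop(name)
        warnings.warn(
            "{} is deprecated and ignored in sagemaker>=2".format(name),
            DeprecationWarning,
        )


kwargs = {"content_type": "text/csv", "env": {}}
removed_kwargs("content_type", kwargs)
removed_kwargs("accept", kwargs)  # absent: no-op, no warning
print(kwargs)  # {'env': {}}
```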

src/sagemaker/amazon/amazon_estimator.py

Lines changed: 20 additions & 0 deletions
@@ -23,6 +23,7 @@
 from sagemaker.amazon import validation
 from sagemaker.amazon.hyperparameter import Hyperparameter as hp  # noqa
 from sagemaker.amazon.common import write_numpy_to_dense_tensor
+from sagemaker.deprecations import renamed_warning
 from sagemaker.estimator import EstimatorBase, _TrainingJob
 from sagemaker.inputs import FileSystemInput, TrainingInput
 from sagemaker.utils import sagemaker_timestamp
@@ -454,3 +455,22 @@ def upload_numpy_to_s3_shards(
                 s3.Object(bucket, key_prefix + file).delete()
         finally:
             raise ex
+
+
+def get_image_uri(region_name, repo_name, repo_version=1):
+    """Deprecated method. Please use sagemaker.image_uris.retrieve().
+
+    Args:
+        region_name: name of the region
+        repo_name: name of the repo (e.g. xgboost)
+        repo_version: version of the repo
+
+    Returns:
+        the image uri
+    """
+    renamed_warning("The method get_image_uri")
+    return image_uris.retrieve(
+        framework=repo_name,
+        region=region_name,
+        version=repo_version,
+    )

src/sagemaker/amazon/common.py

Lines changed: 5 additions & 0 deletions
@@ -21,6 +21,7 @@
 import numpy as np
 
 from sagemaker.amazon.record_pb2 import Record
+from sagemaker.deprecations import deprecated_class
 from sagemaker.deserializers import BaseDeserializer
 from sagemaker.serializers import BaseSerializer
 from sagemaker.utils import DeferredError
@@ -298,3 +299,7 @@ def _resolve_type(dtype):
     if dtype == np.dtype("float32"):
         return "Float32"
     raise ValueError("Unsupported dtype {} on array".format(dtype))
+
+
+numpy_to_record_serializer = deprecated_class(RecordSerializer, "numpy_to_record_serializer")
+record_deserializer = deprecated_class(RecordDeserializer, "record_deserializer")
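``deprecated_class`` keeps the old ``numpy_to_record_serializer`` and ``record_deserializer`` names importable while steering users toward the new classes. A self-contained sketch of such a factory under assumed semantics (warn at instantiation, then behave like the new class); the ``RecordSerializer`` stand-in below is hypothetical, not the real serializer:

```python
import warnings


def deprecated_class(cls, old_name):
    """Sketch: return a subclass that warns with the old name on init."""
    class DeprecatedClass(cls):
        def __init__(self, *args, **kwargs):
            warnings.warn(
                "{} is deprecated, use {} instead".format(old_name, cls.__name__),
                DeprecationWarning,
            )
            super().__init__(*args, **kwargs)

    DeprecatedClass.__name__ = old_name
    return DeprecatedClass


class RecordSerializer:  # stand-in for the real class, for illustration
    def __init__(self):
        self.content_type = "application/x-recordio-protobuf"


numpy_to_record_serializer = deprecated_class(RecordSerializer, "numpy_to_record_serializer")
s = numpy_to_record_serializer()  # emits a DeprecationWarning
print(s.content_type)  # application/x-recordio-protobuf
```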

src/sagemaker/cli/compatibility/v2/modifiers/training_params.py

Lines changed: 2 additions & 2 deletions
@@ -45,7 +45,7 @@
     "train_instance_count",
     "train_instance_type",
     "train_max_run",
-    "train_max_run_wait",
+    "train_max_wait",
     "train_use_spot_instances",
     "train_volume_size",
     "train_volume_kms_key",
@@ -63,7 +63,7 @@ def node_should_be_modified(self, node):
         - ``train_instance_count``
         - ``train_instance_type``
         - ``train_max_run``
-        - ``train_max_run_wait``
+        - ``train_max_wait``
         - ``train_use_spot_instances``
         - ``train_volume_kms_key``
         - ``train_volume_size``
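This fix corrects the parameter name (``train_max_wait``, not ``train_max_run_wait``) that the v2 compatibility modifier looks for when rewriting user code. A toy version of the underlying idea, using Python's ``ast`` module to rename keyword arguments in calls; the real modifier's API and rename logic differ:

```python
import ast

# Hypothetical subset of the rename map, for illustration only.
RENAMES = {"train_max_wait": "max_wait", "train_max_run": "max_run"}


class RenameTrainingParams(ast.NodeTransformer):
    """Rename v1-era keyword arguments in every call expression."""

    def visit_Call(self, node):
        for keyword in node.keywords:
            if keyword.arg in RENAMES:
                keyword.arg = RENAMES[keyword.arg]
        self.generic_visit(node)
        return node


tree = ast.parse("Estimator(train_max_wait=3600)")
tree = RenameTrainingParams().visit(tree)
call = tree.body[0].value
print(call.keywords[0].arg)  # max_wait
```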

src/sagemaker/content_types.py

Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
+# Copyright 2020 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+"""Deprecated content type constants. Just use the mime type strings."""
+from __future__ import absolute_import
+
+import deprecations
+
+deprecations.removed_warning("The sagemaker.content_types module")
+
+CONTENT_TYPE_JSON = "application/json"
+CONTENT_TYPE_CSV = "text/csv"
+CONTENT_TYPE_OCTET_STREAM = "application/octet-stream"
+CONTENT_TYPE_NPY = "application/x-npy"
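With the constants module reduced to a deprecation shim, callers can pass the MIME strings directly instead of importing ``CONTENT_TYPE_*`` names. A hypothetical dispatch helper, just to illustrate working with the raw strings:

```python
import csv
import io
import json


def parse_payload(payload, content_type):
    """Decode a payload by its MIME type string (illustrative helper)."""
    if content_type == "application/json":
        return json.loads(payload)
    if content_type == "text/csv":
        return list(csv.reader(io.StringIO(payload)))
    raise ValueError("unsupported content type: " + content_type)


print(parse_payload('{"score": 0.9}', "application/json"))  # {'score': 0.9}
print(parse_payload("1,2,3", "text/csv"))  # [['1', '2', '3']]
```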
