6 files changed, +46 -4 lines changed (changed paths include doc/frameworks/tensorflow).
 # Changelog

+## v2.86.0 (2022-04-12)
+
+### Features
+
+ * Adds Spark Processing Notebook to Notebook Tests
+
+## v2.85.0 (2022-04-11)
+
+### Features
+
+ * update lambda code on pipeline create/update/upsert for Lamb…
+ * jumpstart model url
+ * add serverless inference image_uri retrieve support
+
+### Bug Fixes and Other Changes
+
+ * Add back the Fix for Pipeline variables related customer issues
+ * Support file URIs in ProcessingStep's code parameter
+
 ## v2.84.0 (2022-04-07)

 ### Features
-2.84.1.dev0
+2.86.1.dev0
@@ -272,8 +272,8 @@ More information on how to create ``export_outputs`` can be found in `specifying
 refer to TensorFlow's `Save and Restore <https://www.tensorflow.org/guide/saved_model>`_ documentation for other ways to control the
 inference-time behavior of your SavedModels.

-Providing Python scripts for pre/pos-processing
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Providing Python scripts for pre/post-processing
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 You can add your customized Python code to process your input and output data.
 This customized Python code must be named ``inference.py`` and specified through the ``entry_point`` parameter:
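For context, a minimal sketch of the wiring the documentation hunk above describes: a handler script named `inference.py` passed via `entry_point`. The handler names follow the TensorFlow Serving container's `input_handler`/`output_handler` convention; the S3 path, role ARN, and framework version are placeholders, not values from this change.

```python
# inference.py -- optional pre/post-processing hooks for the TensorFlow
# Serving container. input_handler shapes the request before it reaches
# TF Serving; output_handler shapes the response before it is returned.
def input_handler(data, context):
    if context.request_content_type == "application/json":
        # The request body is already the JSON payload TF Serving expects.
        return data.read().decode("utf-8")
    raise ValueError("Unsupported content type: {}".format(context.request_content_type))


def output_handler(response, context):
    # Return the raw TF Serving response plus the caller's requested content type.
    return response.content, context.accept_header
```

```python
from sagemaker.tensorflow import TensorFlowModel

# Placeholder model artifact, role, and framework version -- substitute your own.
model = TensorFlowModel(
    model_data="s3://my-bucket/my-model/model.tar.gz",
    role="arn:aws:iam::111122223333:role/MySageMakerRole",
    framework_version="2.8",
    entry_point="inference.py",  # the pre/post-processing script above
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```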
@@ -478,7 +478,7 @@ def _script_mode_env_vars(self):
         dir_name = None
         if self.uploaded_code:
             script_name = self.uploaded_code.script_name
-            if self.enable_network_isolation():
+            if self.repacked_model_data or self.enable_network_isolation():
                 dir_name = "/opt/ml/model/code"
             else:
                 dir_name = self.uploaded_code.s3_prefix
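The net effect of the one-line change above: `SAGEMAKER_SUBMIT_DIRECTORY` now points inside the model archive whenever the inference code has been repacked into it, not only when network isolation is enabled. A toy restatement of that branch (a standalone sketch, not the SDK method itself):

```python
# Sketch of the decision changed in _script_mode_env_vars: where should the
# container look for the user's script-mode code?
def submit_directory(s3_code_prefix, repacked_model_data, network_isolation):
    if repacked_model_data or network_isolation:
        # Code was repacked into model.tar.gz (or S3 access is cut off),
        # so it must be read from inside the deployed model.
        return "/opt/ml/model/code"
    # Otherwise the container downloads the code from its S3 prefix.
    return s3_code_prefix


# A repacked script-mode model now gets the in-container path even without
# network isolation, which is what the new unit test below asserts.
assert (
    submit_directory("s3://bucket/code", "s3://bucket/repacked/model.tar.gz", False)
    == "/opt/ml/model/code"
)
```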
@@ -119,6 +119,7 @@ echo "set SAGEMAKER_ROLE_ARN=$SAGEMAKER_ROLE_ARN"
   --region us-west-2 \
   --lifecycle-config-name $LIFECYCLE_CONFIG_NAME \
   --notebook-instance-role-arn $SAGEMAKER_ROLE_ARN \
+  ./amazon-sagemaker-examples/sagemaker_processing/spark_distributed_data_processing/sagemaker-spark-processing.ipynb \
   ./amazon-sagemaker-examples/advanced_functionality/kmeans_bring_your_own_model/kmeans_bring_your_own_model.ipynb \
   ./amazon-sagemaker-examples/advanced_functionality/tensorflow_iris_byom/tensorflow_BYOM_iris.ipynb \
   ./amazon-sagemaker-examples/sagemaker-python-sdk/1P_kmeans_highlevel/kmeans_mnist.ipynb \
@@ -665,3 +665,25 @@ def test_all_framework_models_add_jumpstart_base_name(

     sagemaker_session.create_model.reset_mock()
     sagemaker_session.endpoint_from_production_variants.reset_mock()
+
+
+@patch("sagemaker.utils.repack_model")
+def test_script_mode_model_uses_proper_sagemaker_submit_dir(repack_model, sagemaker_session):
+
+    source_dir = "s3://blah/blah/blah"
+    t = Model(
+        entry_point=ENTRY_POINT_INFERENCE,
+        role=ROLE,
+        sagemaker_session=sagemaker_session,
+        source_dir=source_dir,
+        image_uri=IMAGE_URI,
+        model_data=MODEL_DATA,
+    )
+    t.deploy(instance_type=INSTANCE_TYPE, initial_instance_count=INSTANCE_COUNT)
+
+    assert (
+        sagemaker_session.create_model.call_args_list[0][0][2]["Environment"][
+            "SAGEMAKER_SUBMIT_DIRECTORY"
+        ]
+        == "/opt/ml/model/code"
+    )