
Commit bda612e

Merge remote-tracking branch 'origin' into feat/jumpstart-private-model-artifacts
2 parents 5010e60 + 5acad56

63 files changed (+4336, -193 lines)


CHANGELOG.md

Lines changed: 23 additions & 0 deletions
@@ -1,5 +1,28 @@
 # Changelog
 
+## v2.193.0 (2023-10-18)
+
+### Features
+
+ * jumpstart model artifact instance type variants
+ * jumpstart instance specific hyperparameters
+ * Feature Processor event based triggers (#1132)
+ * Support job checkpoint in remote function
+ * jumpstart model package arn instance type variants
+
+### Bug Fixes and Other Changes
+
+ * Fix hyperlinks in feature_processor.scheduler parameter descriptions
+ * add image_uris_unit_test pytest mark
+ * bump apache-airflow to `v2.7.2`
+ * clone distribution in validate_distribution
+ * fix flaky Inference Recommender integration tests
+
+### Documentation Changes
+
+ * Update PipelineModel.register documentation
+ * specify that input_shape in no longer required for torch 2.0 mod…
+
 ## v2.192.1 (2023-10-13)
 
 ### Bug Fixes and Other Changes

VERSION

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-2.192.2.dev0
+2.193.1.dev0

doc/amazon_sagemaker_featurestore.rst

Lines changed: 8 additions & 4 deletions
@@ -230,9 +230,11 @@ The following code from the fraud detection example shows a minimal
     enable_online_store=True
 )
 
-Creating a feature group takes time as the data is loaded. You will need
-to wait until it is created before you can use it. You can check status
-using the following method.
+Creating a feature group takes time as the data is loaded. You will
+need to wait until it is created before you can use it. You can
+check status using the following method. Note that it can take
+approximately 10-15 minutes to provision an online ``FeatureGroup``
+with the ``InMemory`` ``StorageType``.
 
 .. code:: python
 
@@ -480,7 +482,9 @@ Feature Store `DatasetBuilder API Reference
 .. rubric:: Delete a feature group
    :name: bCe9CA61b78
 
-You can delete a feature group with the ``delete`` function.
+You can delete a feature group with the ``delete`` function. Note that it
+can take approximately 10-15 minutes to delete an online ``FeatureGroup``
+with the ``InMemory`` ``StorageType``.
 
 .. code:: python
 
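The status check referenced in this doc can be sketched as follows. This is a minimal, illustrative snippet (the feature group name and polling interval are assumptions, not taken from the fraud detection example):

import time

from sagemaker import Session
from sagemaker.feature_store.feature_group import FeatureGroup

# Illustrative name and session; substitute your own feature group.
feature_group = FeatureGroup(name="transactions-feature-group", sagemaker_session=Session())

# Poll until the feature group leaves the "Creating" state. With the
# InMemory StorageType this can take roughly 10-15 minutes.
status = feature_group.describe().get("FeatureGroupStatus")
while status == "Creating":
    time.sleep(30)
    status = feature_group.describe().get("FeatureGroupStatus")

if status != "Created":
    raise RuntimeError(f"Feature group ended in unexpected status: {status}")

# Deletion (feature_group.delete()) is likewise asynchronous and can take a
# similar amount of time for an online InMemory feature group.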

requirements/extras/test_requirements.txt

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ awslogs==0.14.0
 black==22.3.0
 stopit==1.1.2
 # Update tox.ini to have correct version of airflow constraints file
-apache-airflow==2.7.1
+apache-airflow==2.7.2
 apache-airflow-providers-amazon==7.2.1
 attrs>=23.1.0,<24
 fabric==2.6.0

src/sagemaker/djl_inference/model.py

Lines changed: 1 addition & 1 deletion
@@ -781,7 +781,7 @@ def serving_image_uri(self, region_name):
             str: The appropriate image URI based on the given parameters.
         """
         if not self.djl_version:
-            self.djl_version = "0.23.0"
+            self.djl_version = "0.24.0"
 
         return image_uris.retrieve(
             self._framework(),
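For reference, the bumped default means a DJLModel with no explicit djl_version now resolves its serving image against the 0.24.0 containers. A rough sketch of the equivalent lookup via the SDK's public image_uris helper; the framework name and region here are assumptions for illustration (the model class derives the framework internally via self._framework()):

from sagemaker import image_uris

# Assumed framework/region values, purely for illustration.
uri = image_uris.retrieve(
    framework="djl-deepspeed",
    region="us-east-1",
    version="0.24.0",  # the new default when djl_version is unset
)
print(uri)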

src/sagemaker/feature_store/feature_processor/__init__.py

Lines changed: 8 additions & 0 deletions
@@ -30,8 +30,16 @@
     to_pipeline,
     schedule,
     describe,
+    put_trigger,
+    delete_trigger,
+    enable_trigger,
+    disable_trigger,
     delete_schedule,
     list_pipelines,
     execute,
     TransformationCode,
+    FeatureProcessorPipelineEvents,
+)
+from sagemaker.feature_store.feature_processor._enums import (  # noqa: F401
+    FeatureProcessorPipelineExecutionStatus,
 )
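These newly exported helpers back the "Feature Processor event based triggers" changelog entry. A hypothetical usage sketch follows; the parameter names (source_pipeline_events, target_pipeline) and the FeatureProcessorPipelineEvents fields are assumptions inferred from these exports, so check the feature_processor API reference for the exact signatures:

from sagemaker.feature_store.feature_processor import (
    FeatureProcessorPipelineEvents,
    FeatureProcessorPipelineExecutionStatus,
    put_trigger,
)

# Hypothetical wiring: run "downstream-pipeline" whenever "upstream-pipeline"
# finishes successfully. Argument names are assumptions, not confirmed API.
put_trigger(
    source_pipeline_events=[
        FeatureProcessorPipelineEvents(
            pipeline_name="upstream-pipeline",
            pipeline_execution_status=[FeatureProcessorPipelineExecutionStatus.SUCCEEDED],
        )
    ],
    target_pipeline="downstream-pipeline",
)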

src/sagemaker/feature_store/feature_processor/_constants.py

Lines changed: 7 additions & 0 deletions
@@ -17,6 +17,7 @@
 
 DEFAULT_INSTANCE_TYPE = "ml.m5.xlarge"
 DEFAULT_SCHEDULE_STATE = "ENABLED"
+DEFAULT_TRIGGER_STATE = "ENABLED"
 UNDERSCORE = "_"
 RESOURCE_NOT_FOUND_EXCEPTION = "ResourceNotFoundException"
 RESOURCE_NOT_FOUND = "ResourceNotFound"
@@ -36,6 +37,8 @@
 FEATURE_PROCESSOR_TAG_KEY = "sm-fs-fe:created-from"
 FEATURE_PROCESSOR_TAG_VALUE = "fp-to-pipeline"
 FEATURE_GROUP_ARN_REGEX_PATTERN = r"arn:(.*?):sagemaker:(.*?):(.*?):feature-group/(.*?)$"
+PIPELINE_ARN_REGEX_PATTERN = r"arn:(.*?):sagemaker:(.*?):(.*?):pipeline/(.*?)$"
+EVENTBRIDGE_RULE_ARN_REGEX_PATTERN = r"arn:(.*?):events:(.*?):(.*?):rule/(.*?)$"
 SAGEMAKER_WHL_FILE_S3_PATH = "s3://ada-private-beta/sagemaker-2.151.1.dev0-py2.py3-none-any.whl"
 S3_DATA_DISTRIBUTION_TYPE = "FullyReplicated"
 PIPELINE_CONTEXT_NAME_TAG_KEY = "sm-fs-fe:feature-engineering-pipeline-context-name"
@@ -45,3 +48,7 @@
     PIPELINE_CONTEXT_NAME_TAG_KEY,
     PIPELINE_VERSION_CONTEXT_NAME_TAG_KEY,
 ]
+BASE_EVENT_PATTERN = {
+    "source": ["aws.sagemaker"],
+    "detail": {"currentPipelineExecutionStatus": [], "pipelineArn": []},
+}
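The new constants are plain data and can be exercised with the standard library alone. A small sketch (the ARN below is made up):

import copy
import re

PIPELINE_ARN_REGEX_PATTERN = r"arn:(.*?):sagemaker:(.*?):(.*?):pipeline/(.*?)$"
BASE_EVENT_PATTERN = {
    "source": ["aws.sagemaker"],
    "detail": {"currentPipelineExecutionStatus": [], "pipelineArn": []},
}

# Made-up ARN purely for illustration.
arn = "arn:aws:sagemaker:us-east-1:123456789012:pipeline/upstream-pipeline"
match = re.match(PIPELINE_ARN_REGEX_PATTERN, arn)
assert match is not None
pipeline_name = match.group(4)  # "upstream-pipeline"

# Build an EventBridge-style event pattern from the base template without
# mutating the shared constant.
event_pattern = copy.deepcopy(BASE_EVENT_PATTERN)
event_pattern["detail"]["currentPipelineExecutionStatus"] = ["Succeeded"]
event_pattern["detail"]["pipelineArn"] = [arn]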

src/sagemaker/feature_store/feature_processor/_enums.py

Lines changed: 10 additions & 0 deletions
@@ -21,3 +21,13 @@ class FeatureProcessorMode(Enum):
 
     PYSPARK = "pyspark"  # Execute a pyspark job.
     PYTHON = "python"  # Execute a regular python script.
+
+
+class FeatureProcessorPipelineExecutionStatus(Enum):
+    """Enum of feature_processor pipeline execution status."""
+
+    EXECUTING = "Executing"
+    STOPPING = "Stopping"
+    STOPPED = "Stopped"
+    FAILED = "Failed"
+    SUCCEEDED = "Succeeded"
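A brief usage note on the new enum: each member's .value holds the status string (e.g. "Succeeded"), so a caller can build a status filter like the sketch below (the variable names are illustrative):

from sagemaker.feature_store.feature_processor import FeatureProcessorPipelineExecutionStatus

# Terminal statuses an event-based trigger might listen for.
terminal_statuses = [
    FeatureProcessorPipelineExecutionStatus.SUCCEEDED,
    FeatureProcessorPipelineExecutionStatus.FAILED,
    FeatureProcessorPipelineExecutionStatus.STOPPED,
]
print([status.value for status in terminal_statuses])  # ['Succeeded', 'Failed', 'Stopped']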
