Commit fbe1ef1

Merge branch 'master' into remove-tf-framework-mode
2 parents: 61f36b0 + 58443a3

46 files changed: +14761 −246 lines


README.md

Lines changed: 2 additions & 1 deletion

@@ -193,7 +193,8 @@ These examples show you how to use model-packages and algorithms from AWS Market
 - [Using AutoML algorithm](aws_marketplace/using_algorithms/automl) provides a detailed walkthrough on how to use AutoML algorithm from AWS Marketplace.

 - [Using Model Packages](aws_marketplace/using_model_packages)
-  - [Using Model Packages From AWS Marketplace](aws_marketplace/using_model_packages/amazon_demo_product) provides a detailed walkthrough on how to use Model Package entities with the enhanced SageMaker Transform/Hosting APIs by choosing a canonical product listed on AWS Marketplace.
+  - [Using Model Packages From AWS Marketplace](aws_marketplace/using_model_packages/generic_sample_notebook) is a generic notebook which provides sample code snippets you can modify and use for performing inference on Model Packages from AWS Marketplace, using Amazon SageMaker.
+  - [Using Amazon Demo product From AWS Marketplace](aws_marketplace/using_model_packages/amazon_demo_product) provides a detailed walkthrough on how to use Model Package entities with the enhanced SageMaker Transform/Hosting APIs by choosing a canonical product listed on AWS Marketplace.
   - [Using models for extracting vehicle metadata](aws_marketplace/using_model_packages/auto_insurance) provides a detailed walkthrough on how to use pre-trained models from AWS Marketplace for extracting metadata for a sample use-case of auto-insurance claim processing.
   - [Using models for identifying non-compliance at a workplace](aws_marketplace/using_model_packages/improving_industrial_workplace_safety) provides a detailed walkthrough on how to use pre-trained models from AWS Marketplace for extracting metadata for a sample use-case of generating summary reports for identifying non-compliance at a construction/industrial workplace.

advanced_functionality/distributed_tensorflow_mask_rcnn/container-optimized/Dockerfile

Lines changed: 2 additions & 1 deletion

@@ -37,13 +37,14 @@ RUN pip install Cython==0.28.4
 RUN pip install pycocotools==2.0.0
 RUN pip install matplotlib==3.0.3
 RUN pip install markdown==3.1
+RUN pip install numpy==1.17.5

 RUN git clone https://github.com/aws-samples/mask-rcnn-tensorflow
 RUN cd /mask-rcnn-tensorflow && git fetch origin 153442bc70b06e59f2bbeadc4d359b240f64cbc2
 RUN cd /mask-rcnn-tensorflow && git reset --hard 153442bc70b06e59f2bbeadc4d359b240f64cbc2

 RUN chmod -R +w /mask-rcnn-tensorflow
-RUN pip install --ignore-installed -e /mask-rcnn-tensorflow/
+RUN pip install -e /mask-rcnn-tensorflow/

 ##########################################################################################
 # SageMaker requirements
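The two Dockerfile changes above work together: numpy is now pinned to 1.17.5, and dropping `--ignore-installed` from the editable install keeps pip from blindly reinstalling already-satisfied dependencies (which could replace the pinned version). As a hedged sketch, not part of the commit, here is the kind of build-time sanity check one could run after the install steps to confirm the pins survived; the function and pin set are illustrative:

```python
# Illustrative check: verify that packages pinned earlier in a Dockerfile
# (e.g. numpy==1.17.5) were not replaced by a later `pip install` step.
from importlib import metadata


def check_pins(pins):
    """Return {package: (pinned, installed)} for every pin that is not
    satisfied; an empty dict means all pins are intact."""
    mismatches = {}
    for pkg, pinned in pins.items():
        try:
            installed = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            installed = None  # package is missing entirely
        if installed != pinned:
            mismatches[pkg] = (pinned, installed)
    return mismatches
```

Running `check_pins({"numpy": "1.17.5"})` as a final `RUN` step would fail loudly (non-empty dict) if a later install had clobbered the pin.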

advanced_functionality/distributed_tensorflow_mask_rcnn/container/Dockerfile

Lines changed: 2 additions & 1 deletion

@@ -31,11 +31,12 @@ RUN pip install Cython==0.28.4
 RUN pip install pycocotools==2.0.0
 RUN pip install matplotlib==3.0.3
 RUN pip install markdown==3.1
+RUN pip install numpy==1.17.5

 RUN git clone https://github.com/tensorpack/tensorpack.git /tensorpack
 RUN cd /tensorpack && git fetch origin 26664c3f1d58ae029ea6c3ba0af6ae11900b1e55
 RUN cd /tensorpack && git reset --hard 26664c3f1d58ae029ea6c3ba0af6ae11900b1e55
-RUN pip install --ignore-installed -e /tensorpack
+RUN pip install -e /tensorpack

 ##########################################################################################
 # SageMaker requirements

advanced_functionality/kmeans_bring_your_own_model/kmeans_bring_your_own_model.ipynb

Lines changed: 5 additions & 4 deletions

@@ -53,16 +53,17 @@
 },
 "outputs": [],
 "source": [
-"bucket = '<your_s3_bucket_name_here>'\n",
-"prefix = 'sagemaker/DEMO-kmeans-byom'\n",
-" \n",
 "# Define IAM role\n",
 "import boto3\n",
 "import re\n",
 " \n",
+"import sagemaker\n",
 "from sagemaker import get_execution_role\n",
 "\n",
-"role = get_execution_role()"
+"role = get_execution_role()\n",
+"bucket = sagemaker.Session().default_bucket()\n",
+"prefix = 'sagemaker/DEMO-kmeans-byom'\n",
+" \n"
 ]
},
{
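The change above (repeated in the other notebooks in this commit) replaces the hardcoded `'<your_s3_bucket_name_here>'` placeholder with `sagemaker.Session().default_bucket()`, so the notebook runs without manual editing. As a hedged sketch of what that call resolves to: to the best of our knowledge the SageMaker Python SDK names (and creates on first use) a per-account bucket following a `sagemaker-<region>-<account-id>` convention; the helper below only illustrates that naming scheme and is not part of the SDK or the commit:

```python
# Illustrative helper (assumption: the SDK's Session.default_bucket() follows
# the "sagemaker-<region>-<account-id>" naming convention).
def default_bucket_name(region: str, account_id: str) -> str:
    return f"sagemaker-{region}-{account_id}"
```

For example, `default_bucket_name("us-east-1", "111122223333")` gives `"sagemaker-us-east-1-111122223333"`; inside a notebook you would call the real `sagemaker.Session().default_bucket()` instead, which also creates the bucket if it does not exist.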

advanced_functionality/pytorch_extending_our_containers/pytorch_extending_our_containers.ipynb

Lines changed: 17 additions & 17 deletions

@@ -121,17 +121,17 @@
 "A number of files are laid out for your use, under the `/opt/ml` directory:\n",
 "\n",
 "    /opt/ml\n",
-"    ├── input\n",
-"    │   ├── config\n",
-"    │   │   ├── hyperparameters.json\n",
-"    │   │   └── resourceConfig.json\n",
-"    │   └── data\n",
-"    │       └── <channel_name>\n",
-"    │           └── <input data>\n",
-"    ├── model\n",
-"    │   └── <model files>\n",
-"    └── output\n",
-"        └── failure\n",
+"    |-- input\n",
+"    |   |-- config\n",
+"    |   |   |-- hyperparameters.json\n",
+"    |   |   `-- resourceConfig.json\n",
+"    |   `-- data\n",
+"    |       `-- <channel_name>\n",
+"    |           `-- <input data>\n",
+"    |-- model\n",
+"    |   `-- <model files>\n",
+"    `-- output\n",
+"        `-- failure\n",
 "\n",
 "##### The input\n",
 "\n",
@@ -158,8 +158,8 @@
 "The container has the model files in the same place that they were written to during training:\n",
 "\n",
 "    /opt/ml\n",
-"    └── model\n",
-"        └── <model files>\n",
+"    `-- model\n",
+"        `-- <model files>\n",
 "\n"
 ]
},
@@ -172,10 +172,10 @@
 "The `container` directory has all the components you need to extend the SageMaker PyTorch container to use as an sample algorithm:\n",
 "\n",
 "    .\n",
-"    ├── Dockerfile\n",
-"    ├── build_and_push.sh\n",
-"    └── cifar10\n",
-"        ├── cifar10.py\n",
+"    |-- Dockerfile\n",
+"    |-- build_and_push.sh\n",
+"    `-- cifar10\n",
+"        `-- cifar10.py\n",
 "\n",
 "Let's discuss each of these in turn:\n",
 "\n",
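The hunks above (and the matching ones in the other notebooks in this commit) swap Unicode box-drawing characters for portable ASCII connectors: `|--` for entries with siblings below them, `` `-- `` for the last entry at each level. A minimal sketch of a renderer producing that ASCII style from a nested dict; the function name and input shape are our own, not part of the commit:

```python
def render_tree(node, prefix=""):
    """Render a nested {name: subtree} dict in the ASCII tree style used in
    the notebooks above: '|--' for intermediate entries, '`--' for the last
    entry at each level, and '|   '/'    ' continuation prefixes."""
    lines = []
    entries = list(node.items())
    for i, (name, children) in enumerate(entries):
        last = i == len(entries) - 1
        lines.append(prefix + ("`-- " if last else "|-- ") + name)
        if children:
            # A vertical bar continues past non-final entries; blanks otherwise.
            lines.extend(render_tree(children, prefix + ("    " if last else "|   ")))
    return lines
```

For example, `render_tree({"model": {"<model files>": {}}})` yields the two lines `` `-- model `` and ``     `-- <model files> ``, matching the second hunk's new tree.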

advanced_functionality/r_bring_your_own/r_bring_your_own.ipynb

Lines changed: 4 additions & 4 deletions

@@ -50,15 +50,15 @@
 },
 "outputs": [],
 "source": [
-"bucket = '<your_s3_bucket_name_here>'\n",
-"prefix = 'sagemaker/DEMO-r-byo'\n",
-" \n",
 "# Define IAM role\n",
 "import boto3\n",
 "import re\n",
+"import sagemaker\n",
 "from sagemaker import get_execution_role\n",
 "\n",
-"role = get_execution_role()"
+"role = get_execution_role()\n",
+"bucket = sagemaker.Session().default_bucket()\n",
+"prefix = 'sagemaker/DEMO-r-byo'"
 ]
},
{

advanced_functionality/scikit_bring_your_own/scikit_bring_your_own.ipynb

Lines changed: 21 additions & 21 deletions

@@ -121,17 +121,17 @@
 "When Amazon SageMaker runs training, your `train` script is run just like a regular Python program. A number of files are laid out for your use, under the `/opt/ml` directory:\n",
 "\n",
 "    /opt/ml\n",
-"    ├── input\n",
-"    │   ├── config\n",
-"    │   │   ├── hyperparameters.json\n",
-"    │   │   └── resourceConfig.json\n",
-"    │   └── data\n",
-"    │       └── <channel_name>\n",
-"    │           └── <input data>\n",
-"    ├── model\n",
-"    │   └── <model files>\n",
-"    └── output\n",
-"        └── failure\n",
+"    |-- input\n",
+"    |   |-- config\n",
+"    |   |   |-- hyperparameters.json\n",
+"    |   |   `-- resourceConfig.json\n",
+"    |   `-- data\n",
+"    |       `-- <channel_name>\n",
+"    |           `-- <input data>\n",
+"    |-- model\n",
+"    |   `-- <model files>\n",
+"    `-- output\n",
+"        `-- failure\n",
 "\n",
 "##### The input\n",
 "\n",
@@ -160,8 +160,8 @@
 "The container will have the model files in the same place they were written during training:\n",
 "\n",
 "    /opt/ml\n",
-"    └── model\n",
-"        └── <model files>\n",
+"    `-- model\n",
+"        `-- <model files>\n",
 "\n"
 ]
},
@@ -174,14 +174,14 @@
 "In the `container` directory are all the components you need to package the sample algorithm for Amazon SageMager:\n",
 "\n",
 "    .\n",
-"    ├── Dockerfile\n",
-"    ├── build_and_push.sh\n",
-"    └── decision_trees\n",
-"        ├── nginx.conf\n",
-"        ├── predictor.py\n",
-"        ├── serve\n",
-"        ├── train\n",
-"        └── wsgi.py\n",
+"    |-- Dockerfile\n",
+"    |-- build_and_push.sh\n",
+"    `-- decision_trees\n",
+"        |-- nginx.conf\n",
+"        |-- predictor.py\n",
+"        |-- serve\n",
+"        |-- train\n",
+"        `-- wsgi.py\n",
 "\n",
 "Let's discuss each of these in turn:\n",
 "\n",

advanced_functionality/tensorflow_bring_your_own/tensorflow_bring_your_own.ipynb

Lines changed: 21 additions & 21 deletions

@@ -116,17 +116,17 @@
 "When Amazon SageMaker runs training, your `train` script is run, as in a regular Python program. A number of files are laid out for your use, under the `/opt/ml` directory:\n",
 "\n",
 "    /opt/ml\n",
-"    ├── input\n",
-"    │   ├── config\n",
-"    │   │   ├── hyperparameters.json\n",
-"    │   │   └── resourceConfig.json\n",
-"    │   └── data\n",
-"    │       └── <channel_name>\n",
-"    │           └── <input data>\n",
-"    ├── model\n",
-"    │   └── <model files>\n",
-"    └── output\n",
-"        └── failure\n",
+"    |-- input\n",
+"    |   |-- config\n",
+"    |   |   |-- hyperparameters.json\n",
+"    |   |   `-- resourceConfig.json\n",
+"    |   `-- data\n",
+"    |       `-- <channel_name>\n",
+"    |           `-- <input data>\n",
+"    |-- model\n",
+"    |   `-- <model files>\n",
+"    `-- output\n",
+"        `-- failure\n",
 "\n",
 "##### The input\n",
 "\n",
@@ -151,8 +151,8 @@
 "The container has the model files in the same place that they were written to during training:\n",
 "\n",
 "    /opt/ml\n",
-"    └── model\n",
-"        └── <model files>\n",
+"    `-- model\n",
+"        `-- <model files>\n",
 "\n"
 ]
},
@@ -165,14 +165,14 @@
 "The `container` directory has all the components you need to package the sample algorithm for Amazon SageMager:\n",
 "\n",
 "    .\n",
-"    ├── Dockerfile\n",
-"    ├── build_and_push.sh\n",
-"    └── cifar10\n",
-"        ├── cifar10.py\n",
-"        ├── resnet_model.py\n",
-"        ├── nginx.conf\n",
-"        ├── serve\n",
-"        ├── train\n",
+"    |-- Dockerfile\n",
+"    |-- build_and_push.sh\n",
+"    `-- cifar10\n",
+"        |-- cifar10.py\n",
+"        |-- resnet_model.py\n",
+"        |-- nginx.conf\n",
+"        |-- serve\n",
+"        `-- train\n",
 "\n",
 "Let's discuss each of these in turn:\n",
 "\n",

aws_marketplace/README.md

Lines changed: 2 additions & 1 deletion

@@ -21,7 +21,8 @@ These examples show you how to use model-packages and algorithms from AWS Market
 - [Using AutoML algorithm](using_algorithms/automl) provides a detailed walkthrough on how to use AutoML algorithm from AWS Marketplace.

 - [Using Model Packages](using_model_packages)
-  - [Using Model Packages From AWS Marketplace](using_model_packages/amazon_demo_product) provides a detailed walkthrough on how to use Model Package entities with the enhanced SageMaker Transform/Hosting APIs by choosing a canonical product listed on AWS Marketplace.
+  - [Using Model Packages From AWS Marketplace](using_model_packages/generic_sample_notebook) is a generic notebook which provides sample code snippets you can modify and use for performing inference on Model Packages from AWS Marketplace, using Amazon SageMaker.
+  - [Using Amazon Demo product From AWS Marketplace](using_model_packages/amazon_demo_product) provides a detailed walkthrough on how to use Model Package entities with the enhanced SageMaker Transform/Hosting APIs by choosing a canonical product listed on AWS Marketplace.
   - [Using models for extracting vehicle metadata](using_model_packages/auto_insurance) provides a detailed walkthrough on how to use pre-trained models from AWS Marketplace for extracting metadata for a sample use-case of auto-insurance claim processing.
   - [Using models for identifying non-compliance at a workplace](using_model_packages/improving_industrial_workplace_safety) provides a detailed walkthrough on how to use pre-trained models from AWS Marketplace for extracting metadata for a sample use-case of generating summary reports for identifying non-compliance at a construction/industrial workplace.
2728

aws_marketplace/creating_marketplace_products/Bring_Your_Own-Creating_Algorithm_and_Model_Package.ipynb

Lines changed: 21 additions & 21 deletions

@@ -121,17 +121,17 @@
 "When Amazon SageMaker runs training, your `train` script is run just like a regular Python program. A number of files are laid out for your use, under the `/opt/ml` directory:\n",
 "\n",
 "    /opt/ml\n",
-"    ├── input\n",
-"    │   ├── config\n",
-"    │   │   ├── hyperparameters.json\n",
-"    │   │   └── resourceConfig.json\n",
-"    │   └── data\n",
-"    │       └── <channel_name>\n",
-"    │           └── <input data>\n",
-"    ├── model\n",
-"    │   └── <model files>\n",
-"    └── output\n",
-"        └── failure\n",
+"    |-- input\n",
+"    |   |-- config\n",
+"    |   |   |-- hyperparameters.json\n",
+"    |   |   `-- resourceConfig.json\n",
+"    |   `-- data\n",
+"    |       `-- <channel_name>\n",
+"    |           `-- <input data>\n",
+"    |-- model\n",
+"    |   `-- <model files>\n",
+"    `-- output\n",
+"        `-- failure\n",
 "\n",
 "##### The input\n",
 "\n",
@@ -160,8 +160,8 @@
 "The container will have the model files in the same place they were written during training:\n",
 "\n",
 "    /opt/ml\n",
-"    └── model\n",
-"        └── <model files>\n",
+"    `-- model\n",
+"        `-- <model files>\n",
 "\n"
 ]
},
@@ -174,14 +174,14 @@
 "In the `container` directory are all the components you need to package the sample algorithm for Amazon SageMager:\n",
 "\n",
 "    .\n",
-"    ├── Dockerfile\n",
-"    ├── build_and_push.sh\n",
-"    └── decision_trees\n",
-"        ├── nginx.conf\n",
-"        ├── predictor.py\n",
-"        ├── serve\n",
-"        ├── train\n",
-"        └── wsgi.py\n",
+"    |-- Dockerfile\n",
+"    |-- build_and_push.sh\n",
+"    `-- decision_trees\n",
+"        |-- nginx.conf\n",
+"        |-- predictor.py\n",
+"        |-- serve\n",
+"        |-- train\n",
+"        `-- wsgi.py\n",
 "\n",
 "Let's discuss each of these in turn:\n",
 "\n",
