
update tf iris example #49


Merged: 2 commits, Nov 25, 2017
@@ -4,24 +4,23 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Creating, training and serving Estimators in SageMaker\n",
"# Creating, training, and serving using SageMaker Estimators\n",
"\n",
"The **SageMaker Python SDK** helps you deploy your models for training and hosting in optimized, productions ready containers in SageMaker. The SageMaker Python SDK is easy to use, modular, extensible and compatible with TensorFlow and MXNet. This tutorial focuses on **TensorFlow** and shows how we can train and host a tensorflow DNNClassifier estimator in SageMaker using the Python SDK.\n",
"The **SageMaker Python SDK** helps you deploy your models for training and hosting in optimized, production ready containers in SageMaker. The SageMaker Python SDK is easy to use, modular, extensible and compatible with TensorFlow and MXNet. This tutorial focuses on **TensorFlow** and shows how we can train and host a TensorFlow DNNClassifier estimator in SageMaker using the Python SDK.\n",
"\n",
"\n",
"TensorFlow's high-level machine learning API (tf.estimator) makes it easy to\n",
"configure, train, and evaluate a variety of machine learning models.\n",
"\n",
"\n",
"In this\n",
"tutorial, you'll use tf.estimator to construct a\n",
"In this tutorial, you'll use tf.estimator to construct a\n",
"[neural network](https://en.wikipedia.org/wiki/Artificial_neural_network)\n",
"classifier and train it on the\n",
"[Iris data set](https://en.wikipedia.org/wiki/Iris_flower_data_set) to\n",
"predict flower species based on sepal/petal geometry. You'll write code to\n",
"perform the following five steps:\n",
"\n",
"1. Deploy a tensorflow container in SageMaker\n",
"1. Deploy a TensorFlow container in SageMaker\n",
"2. Load CSVs containing Iris training/test data from a S3 bucket into a TensorFlow `Dataset`\n",
"3. Construct a `tf.estimator.DNNClassifier` neural network classifier\n",
"4. Train the model using the training data\n",
@@ -77,9 +76,9 @@
" iris_training.csv\n",
"* A test set of 30 samples\n",
" iris_test.csv\n",
" \n",
"These files are provided in the SageMaker sample data bucket: \n",
"s3://sagemaker-sample-data/tensorflow/iris"
"\n",
"These files are provided in the SageMaker sample data bucket:\n",
"**s3://sagemaker-sample-data-{region}/tensorflow/iris**. Copies of the bucket exist in each SageMaker region. When we access the data, we'll replace {region} with the AWS region the notebook is running in."
]
},
{
@@ -99,8 +98,11 @@
"source": [
"#Bucket location to save your custom code in tar.gz format.\n",
"custom_code_upload_location = 's3://<bucket-name>/customcode/tensorflow_iris'\n",
"\n",
"#Bucket location where results of model training are saved.\n",
"model_artifacts_location = 's3://<bucket-name>/artifacts'\n",
"\n",
"#IAM execution role that gives SageMaker access to resources in your AWS account.\n",
"role='<your SageMaker execution role here>'"
]
},
@@ -138,14 +140,15 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Construct a Deep Neural Network Classifier"
"# Construct a deep neural network classifier"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Complete Neural Network Source Code \n",
"## Complete neural network source code \n",
"\n",
"Here is the full code for the neural network classifier:"
]
},
@@ -155,7 +158,7 @@
"metadata": {},
"outputs": [],
"source": [
"!cat \"iris_dnn_classifier.py\""
"!cat \"/home/ec2-user/sample-notebooks/sagemaker-python-sdk/tensorflow_iris_dnn_classifier_using_estimators/iris_dnn_classifier.py\""
]
},
{
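The file printed by the cell above is not part of this diff. As a point of reference only, here is a minimal sketch of the estimator_fn that iris_dnn_classifier.py is expected to define; the hidden-unit sizes and the 'inputs' feature name are assumptions based on the canonical iris example, not taken from this PR.

    # Sketch (assumed, not from this PR): estimator_fn builds a 3-class DNNClassifier
    # over the four numeric iris features, using the RunConfig SageMaker passes in.
    import tensorflow as tf

    INPUT_TENSOR_NAME = 'inputs'  # assumed feature name

    def estimator_fn(run_config, params):
        feature_columns = [tf.feature_column.numeric_column(INPUT_TENSOR_NAME, shape=[4])]
        return tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                          hidden_units=[10, 20, 10],
                                          n_classes=3,
                                          config=run_config)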
@@ -170,7 +173,7 @@
"metadata": {},
"source": [
"### Using a tf.estimator in SageMaker\n",
"Using an estimator in SageMaker is very easy, you can create one with few lines of code:"
"Using a TensorFlow estimator in SageMaker is very easy, you can create one with few lines of code:"
]
},
{
@@ -245,7 +248,7 @@
"source": [
"### Describe the serving input pipeline:\n",
"\n",
"After traininng your model, SageMaker will host it in a tensorflow serving. You need to describe a serving input function:"
"After traininng your model, SageMaker will host it in a TensorFlow serving. You need to describe a serving input function:"
]
},
{
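The cell that defines the serving input function is collapsed in this diff. A minimal sketch of what such a function typically looks like for this four-feature model, assuming the same 'inputs' feature name as above:

    # Sketch (assumed): parse a 4-element float feature vector for TensorFlow Serving.
    import tensorflow as tf

    INPUT_TENSOR_NAME = 'inputs'  # assumed feature name

    def serving_input_fn():
        feature_spec = {INPUT_TENSOR_NAME: tf.FixedLenFeature(dtype=tf.float32, shape=[4])}
        return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()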
@@ -270,7 +273,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Train a Model on Amazon SageMaker Using TensorFlow Custom Code\n",
"# Train a Model on Amazon SageMaker using TensorFlow custom code\n",
"\n",
"We can use the SDK to run our local training script on SageMaker infrastructure.\n",
"\n",
@@ -292,8 +295,8 @@
" code_location=custom_code_upload_location,\n",
" train_instance_count=1,\n",
" train_instance_type='ml.c4.xlarge',\n",
" hyperparameters={'training_steps': 100})\n",
"\n"
" training_steps=1000,\n",
" evaluation_steps=100)"
]
},
{
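For context, the full constructor this hunk modifies would look roughly like the sketch below; the entry_point filename and the iris_estimator variable name are assumptions consistent with the surrounding cells, not part of the diff.

    # Sketch (assumed): construct the SageMaker TensorFlow estimator with the
    # hyperparameters shown in the hunk above.
    from sagemaker.tensorflow import TensorFlow

    iris_estimator = TensorFlow(entry_point='iris_dnn_classifier.py',  # assumed script name
                                role=role,
                                output_path=model_artifacts_location,
                                code_location=custom_code_upload_location,
                                train_instance_count=1,
                                train_instance_type='ml.c4.xlarge',
                                training_steps=1000,
                                evaluation_steps=100)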
@@ -305,6 +308,7 @@
"%%time\n",
"import boto3\n",
"\n",
"# use the region-specific sample data bucket\n",
"region = boto3.Session().region_name\n",
"train_data_location = 's3://sagemaker-sample-data-{}/tensorflow/iris'.format(region)\n",
"\n",
@@ -315,7 +319,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Deploy the Trained Model \n",
"# Deploy the trained Model \n",
"\n",
"The deploy() method creates an endpoint which serves prediction requests in real-time."
]
@@ -329,7 +333,6 @@
"outputs": [],
"source": [
"%%time\n",
"\n",
"iris_predictor = iris_estimator.deploy(initial_instance_count=1,\n",
" instance_type='ml.c4.xlarge')"
]
@@ -338,7 +341,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Invoke the Endpoint to Get Inferences"
"# Invoke the Endpoint to get inferences"
]
},
{
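The invocation cell itself is collapsed here. A hedged sketch of a single prediction request against the deployed endpoint; the four feature values (sepal/petal measurements) are illustrative only:

    # Sketch (assumed): request a classification for one iris sample.
    iris_predictor.predict([6.4, 3.2, 4.5, 1.5])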
@@ -366,7 +369,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# (Optional) Delete the Endpoint"
"# (Optional) Delete the Endpoint\n",
"\n",
"After you have finished with this example, remember to delete the prediction endpoint to release the instance(s) associated with it."
]
},
{
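The clean-up cell is collapsed in this diff; a minimal sketch of the deletion, assuming the SDK's Session.delete_endpoint call and the predictor's endpoint attribute:

    # Sketch (assumed): remove the hosted endpoint created by deploy().
    import sagemaker

    sagemaker.Session().delete_endpoint(iris_predictor.endpoint)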
@@ -398,7 +403,6 @@
}
],
"metadata": {
"notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.",
"kernelspec": {
"display_name": "Environment (conda_tensorflow_p27)",
"language": "python",
@@ -415,7 +419,8 @@
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.14"
}
},
"notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License."
},
"nbformat": 4,
"nbformat_minor": 1