
Commit 39426e8

Author: Chuyang Deng
Message: Remove TF container reference.
Parent: aa17e3d

File tree

1 file changed: +5 -5 lines changed


sagemaker-python-sdk/tensorflow_serving_using_elastic_inference_with_your_own_model/tensorflow_serving_pretrained_model_elastic_inference.ipynb

Lines changed: 5 additions & 5 deletions
@@ -38,7 +38,7 @@
 "\n",
 "The pre-trained model we will be using for this example is an NCHW ResNet-50 model from the [official TensorFlow models GitHub repository](https://github.com/tensorflow/models/tree/master/official/resnet#pre-trained-model). For more information regarding deep residual networks, please check [here](https://github.com/tensorflow/models/tree/master/official/resnet). It isn't a requirement to train our model on SageMaker in order to use SageMaker for serving our model.\n",
 "\n",
-"SageMaker expects our models to be compressed in a tar.gz format in S3. Thankfully, our model already comes in that format. The predefined SageMaker TensorFlow containers utilize TensorFlow Serving for loading and handling inferences, so it is expecting a SavedModel. The [predefined SageMaker TensorFlow](https://github.com/aws/sagemaker-tensorflow-container/blob/master/src/tf_container/serve.py#L108) container expects the base folder name for the saved model to be `export/Servo`, followed by a version and your SavedModel artifacts.\n",
+"SageMaker expects our models to be compressed in a tar.gz format in S3. Thankfully, our model already comes in that format.\n",
 "\n",
 "To host our model for inferences in SageMaker, we first need to upload the SavedModel to S3. This can be done through the AWS console or the AWS command line.\n",
 "\n",
@@ -92,7 +92,7 @@
 "There are a few parameters that our TensorFlow Serving Model expects.\n",
 "1. `model_data` - The S3 location of a model tar.gz file to load in SageMaker.\n",
 "2. `role` - An IAM role name or ARN for SageMaker to access AWS resources on your behalf.\n",
-"3. `framework_version` - TensorFlow serving version you want to use for handling your inference request .\n"
+"3. `framework_version` - The TensorFlow Serving version you want to use for handling your inference requests.\n"
 ]
 },
 {
@@ -103,9 +103,9 @@
 "source": [
 "from sagemaker.tensorflow.serving import Model\n",
 "\n",
-"tensorflow_model = TensorFlowModel(model_data=saved_model,\n",
-"                                   role=role,\n",
-"                                   framework_version='1.12')"
+"tensorflow_model = Model(model_data=saved_model,\n",
+"                         role=role,\n",
+"                         framework_version='1.12')"
 ]
 },
 {
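For context, the tar.gz packaging that the retained notebook text describes ("SageMaker expects our models to be compressed in a tar.gz format in S3") can be sketched locally with the standard library. The directory layout and file names below are illustrative placeholders, not taken from the notebook or from the actual ResNet-50 artifact:

```python
import pathlib
import tarfile
import tempfile

# Build a stand-in SavedModel directory (file names are hypothetical;
# a real SavedModel contains saved_model.pb plus a variables/ folder).
base = pathlib.Path(tempfile.mkdtemp())
model_dir = base / "model"
(model_dir / "variables").mkdir(parents=True)
(model_dir / "saved_model.pb").write_bytes(b"")
(model_dir / "variables" / "variables.index").write_bytes(b"")

# Compress the SavedModel into the tar.gz archive that would be
# uploaded to S3 and passed to Model(model_data=...).
archive = base / "model.tar.gz"
with tarfile.open(archive, "w:gz") as tar:
    tar.add(model_dir, arcname="model")

# Inspect the archive contents to confirm the expected structure.
with tarfile.open(archive, "r:gz") as tar:
    names = tar.getnames()
```

After the archive exists, uploading it to S3 (console or `aws s3 cp`) yields the `model_data` URI the notebook's `Model(...)` constructor consumes.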
