Commit 08817f3
Author: Chuyang Deng
Parent: 39426e8

    Add TFS container reference.

File tree: 1 file changed (+1, -1 lines)


sagemaker-python-sdk/tensorflow_serving_using_elastic_inference_with_your_own_model/tensorflow_serving_pretrained_model_elastic_inference.ipynb

Lines changed: 1 addition & 1 deletion
@@ -38,7 +38,7 @@
 "\n",
 "The pre-trained model we will be using for this example is an NCHW ResNet-50 model from the [official TensorFlow model GitHub repository](https://github.com/tensorflow/models/tree/master/official/resnet#pre-trained-model). For more information on deep residual networks, please check [here](https://github.com/tensorflow/models/tree/master/official/resnet). It isn't a requirement to train our model on SageMaker to use SageMaker for serving our model.\n",
 "\n",
-"SageMaker expects our models to be compressed in a tar.gz format in S3. Thankfully, our model already comes in that format.\n",
+"SageMaker expects our models to be compressed in a tar.gz format in S3. Thankfully, our model already comes in that format. The predefined TensorFlow Serving containers use a REST API for handling inferences; for more information, please see [predefined SageMaker TensorFlow Serving](https://github.com/aws/sagemaker-tensorflow-serving-container/blob/master/container/sagemaker/serve.py).\n",
 "\n",
 "To host our model for inferences in SageMaker, we need to first upload the SavedModel to S3. This can be done through the AWS console or AWS command line.\n",
 "\n",
