
Commit 5682c42

pintaoz-aws and pintaoz authored

Add framework_version to all TensorFlowModel examples (aws#5038)

* Add framework_version to all TensorFlowModel examples
* update framework_version to x.x.x

Co-authored-by: pintaoz <[email protected]>

1 parent 9e44f84 commit 5682c42

File tree

2 files changed: +10 −7 lines changed

doc/frameworks/tensorflow/deploying_tensorflow_serving.rst

Lines changed: 2 additions & 2 deletions
@@ -64,7 +64,7 @@ If you already have existing model artifacts in S3, you can skip training and de

     from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='x.x.x')

     predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge')
@@ -74,7 +74,7 @@ Python-based TensorFlow serving on SageMaker has support for `Elastic Inference

     from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='x.x.x')

     predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge', accelerator_type='ml.eia1.medium')

doc/frameworks/tensorflow/using_tf.rst

Lines changed: 8 additions & 5 deletions
@@ -468,7 +468,7 @@ If you already have existing model artifacts in S3, you can skip training and de

     from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='x.x.x')

     predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge')
@@ -478,7 +478,7 @@ Python-based TensorFlow serving on SageMaker has support for `Elastic Inference

     from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='x.x.x')

     predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge', accelerator_type='ml.eia1.medium')
@@ -767,7 +767,8 @@ This customized Python code must be named ``inference.py`` and is specified thro

     model = TensorFlowModel(entry_point='inference.py',
                             model_data='s3://mybucket/model.tar.gz',
-                            role='MySageMakerRole')
+                            role='MySageMakerRole',
+                            framework_version='x.x.x')

In the example above, ``inference.py`` is assumed to be a file inside ``model.tar.gz``. If you want to use a local file instead, you must add the ``source_dir`` argument. See the documentation on `TensorFlowModel <https://sagemaker.readthedocs.io/en/stable/frameworks/tensorflow/sagemaker.tensorflow.html#sagemaker.tensorflow.model.TensorFlowModel>`_.
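The hunk above notes that ``inference.py`` is assumed to live inside ``model.tar.gz``. As a minimal sketch of building such an archive — the ``package_model`` helper is a hypothetical illustration, not part of the SageMaker SDK, and the top-level SavedModel directory layout is an assumption:

```python
import tarfile
from pathlib import Path


def package_model(saved_model_dir, inference_script, out_path="model.tar.gz"):
    """Hypothetical helper: bundle a SavedModel directory and an
    inference.py script into model.tar.gz, placing the script under
    code/, where the TensorFlow Serving container looks for
    customized handler code."""
    with tarfile.open(out_path, "w:gz") as tar:
        # The SavedModel directory goes at the archive root,
        # keeping its own directory name (e.g. a numeric version).
        tar.add(saved_model_dir, arcname=Path(saved_model_dir).name)
        # Customized Python code goes at code/inference.py.
        tar.add(inference_script, arcname="code/inference.py")
    return out_path
```

The resulting archive would then be uploaded to S3 and passed as ``model_data``.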

@@ -923,7 +924,8 @@ processing. There are 2 ways to do this:

     model = TensorFlowModel(entry_point='inference.py',
                             dependencies=['requirements.txt'],
                             model_data='s3://mybucket/model.tar.gz',
-                            role='MySageMakerRole')
+                            role='MySageMakerRole',
+                            framework_version='x.x.x')
2. If you are working in a network-isolation situation or if you don't want to install dependencies at runtime, you can provide them in a local folder named ``lib``:
@@ -941,7 +943,8 @@ processing. There are 2 ways to do this:

     model = TensorFlowModel(entry_point='inference.py',
                             dependencies=['/path/to/folder/named/lib'],
                             model_data='s3://mybucket/model.tar.gz',
-                            role='MySageMakerRole')
+                            role='MySageMakerRole',
+                            framework_version='x.x.x')
For more information, see: https://github.com/aws/sagemaker-tensorflow-serving-container#prepost-processing
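The pre/post-processing interface linked above is built around ``input_handler`` and ``output_handler`` functions defined in ``inference.py``. A minimal sketch, assuming a JSON passthrough payload — the handler bodies here are illustrative, so consult the linked container README for the actual contract:

```python
import json


def input_handler(data, context):
    """Pre-process a request before it reaches TensorFlow Serving.

    `data` is a stream of the raw request body; the return value is
    the JSON payload for TF Serving's REST API ({"instances": [...]}).
    """
    if context.request_content_type == "application/json":
        instances = json.loads(data.read().decode("utf-8"))
        return json.dumps({"instances": instances})
    raise ValueError(
        "Unsupported content type: {}".format(context.request_content_type))


def output_handler(response, context):
    """Post-process the TF Serving response.

    Returns a (body, content type) pair for the HTTP response.
    """
    if response.status_code != 200:
        raise ValueError(response.content.decode("utf-8"))
    return response.content, context.accept_header
```

Because the handlers only depend on the ``context`` object's attributes, they can be exercised locally with simple stand-in objects before being packaged into ``model.tar.gz``.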
