This repository was archived by the owner on May 23, 2024. It is now read-only.

Modify EI image repository and tag to match Python SDK. #12

Merged
merged 2 commits into from
Mar 7, 2019

Conversation

@chuyang-deng (Contributor) commented Mar 5, 2019

Issue #, if available:

Description of changes:
Update image repository name and tag to be consistent with Python SDK.
https://github.com/aws/sagemaker-python-sdk/blob/master/src/sagemaker/fw_utils.py#L97

An EI image for TensorFlow looks like: 520713654638.dkr.ecr.us-west-2.amazonaws.com/sagemaker-tensorflow-eia:1.12-cpu-py2

The ECR repository it gets mapped to is sagemaker-tensorflow-eia, because the SDK appends -eia to the end of the non-EI repository name; in this case the repository should therefore be sagemaker-tensorflow-serving-eia.
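The append rule described above can be sketched as follows (a hypothetical illustration of the naming convention, not the SDK's actual code; the variable names are made up):

```shell
# Hypothetical sketch of the repository-name mapping described above:
# the Python SDK appends "-eia" to the base repository name when an
# Elastic Inference accelerator is attached.
base_repo="sagemaker-tensorflow-serving"
device="eia"   # one of: cpu, gpu, eia

repo="${base_repo}"
if [ "${device}" = "eia" ]; then
    repo="${repo}-eia"
fi
echo "${repo}"   # prints sagemaker-tensorflow-serving-eia
```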

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

@jesterhazy (Contributor)

AWS CodeBuild CI Report

  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@ChoiByungWook left a comment


Nice!

@@ -39,6 +39,22 @@ function get_tfs_executable() {
rm ${zip_file} && rm -rf exec_dir
}

function get_device_type() {


I think it would be less confusing to explicitly show that providing EI results in cpu being selected here.
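A minimal sketch of what this suggestion could look like (hypothetical, assuming the script takes eia/gpu/cpu as its device argument): the eia-to-cpu mapping is spelled out rather than implied.

```shell
# Hypothetical sketch: make it explicit that "eia" selects the cpu device type.
function get_device_type() {
    if [ "$1" = "eia" ]; then
        echo "cpu"   # EI accelerators run against the CPU image
    else
        echo "$1"    # cpu or gpu pass through unchanged
    fi
}

get_device_type eia   # prints cpu
```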

Contributor


user has to provide eia/gpu/cpu arg to script so is this method even needed?

README.md Outdated
@@ -119,10 +119,10 @@ To test against Elastic Inference with Accelerator, you will need an AWS account
For example:

pytest test/integrationsagemaker/test_elastic_inference.py --aws-id 0123456789012 \


There is a / missing in between integration and sagemaker.

README.md Outdated
@@ -119,10 +119,10 @@ To test against Elastic Inference with Accelerator, you will need an AWS account
For example:

pytest test/integrationsagemaker/test_elastic_inference.py --aws-id 0123456789012 \
-  --docker-base-name sagemaker-tensorflow-serving \
+  --docker-base-name sagemaker-tensorflow-serving-eia \
--instance_type ml.m4.xlarge \
--accelerator-type ml.eia1.large \


let's default this to a medium for now.

}

function get_repository_name() {
if [ $1 = 'ei' ]; then
Contributor


if we make user pass 'eia' instead of 'ei', then this is just an if/append
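Under this suggestion, get_repository_name would reduce to a conditional suffix append (a hypothetical sketch, assuming the base name is sagemaker-tensorflow-serving):

```shell
# Hypothetical simplified version: with "eia" as the argument value,
# the repository name is just the base name plus an optional "-eia" suffix.
function get_repository_name() {
    local name="sagemaker-tensorflow-serving"
    if [ "$1" = "eia" ]; then
        name="${name}-eia"
    fi
    echo "${name}"
}

get_repository_name eia   # prints sagemaker-tensorflow-serving-eia
```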


@jesterhazy (Contributor)

AWS CodeBuild CI Report

  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@chuyang-deng chuyang-deng merged commit 8975282 into aws:master Mar 7, 2019