
Remove p3 usage from the notebooks. #308


Merged 1 commit on Jul 10, 2018
@@ -31,7 +31,7 @@
     "\n",
     "## Setup\n",
     "\n",
-    "_This notebook was created and tested on an ml.p3.2xlarge notebook instance._\n",
+    "_This notebook was created and tested on an ml.p2.xlarge notebook instance._\n",
     "\n",
     "Let's start by creating a SageMaker session and specifying:\n",
     "\n",
@@ -171,7 +171,7 @@
     "metadata": {},
     "source": [
     "### Run training in SageMaker\n",
-    "The PyTorch class allows us to run our training function as a training job on SageMaker infrastructure. We need to configure it with our training script and source directory, an IAM role, the number of training instances, and the training instance type. In this case we will run our training job on ml.p3.2xlarge instance. As you can see in this example you can also specify hyperparameters. "
+    "The PyTorch class allows us to run our training function as a training job on SageMaker infrastructure. We need to configure it with our training script and source directory, an IAM role, the number of training instances, and the training instance type. In this case we will run our training job on ```ml.p2.xlarge``` instance. As you can see in this example you can also specify hyperparameters. "
     ]
    },
    {
@@ -186,7 +186,7 @@
     " role=role,\n",
     " framework_version='0.4.0',\n",
     " train_instance_count=1,\n",
-    " train_instance_type='ml.p3.2xlarge',\n",
+    " train_instance_type='ml.p2.xlarge',\n",
     " source_dir='source',\n",
     " # available hyperparameters: emsize, nhid, nlayers, lr, clip, epochs, batch_size,\n",
     " # bptt, dropout, tied, seed, log_interval\n",
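The hunks above reconfigure the first notebook's `PyTorch` estimator. As a minimal sketch, the settings implied by the diff can be collected in a plain dict and inspected without an AWS session (only the keys actually shown in the diff are included; the estimator's entry-point script is not shown, so it is omitted):

```python
# Hedged sketch: estimator settings implied by the hunks above, gathered in a
# plain dict so the change is visible without creating a SageMaker session.
estimator_kwargs = {
    "framework_version": "0.4.0",
    "train_instance_count": 1,
    "train_instance_type": "ml.p2.xlarge",  # was ml.p3.2xlarge before this PR
    "source_dir": "source",
}

# The single K80-GPU p2.xlarge replaces the pricier V100-backed p3.2xlarge.
print(estimator_kwargs["train_instance_type"])
```

In the notebook itself these keys are passed as keyword arguments to the `PyTorch` estimator constructor, together with the training script and IAM role mentioned in the markdown cell.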
4 changes: 2 additions & 2 deletions sagemaker-python-sdk/pytorch_mnist/pytorch_mnist.ipynb
@@ -143,7 +143,7 @@
     "source": [
     "### Run training in SageMaker\n",
     "\n",
-    "The `PyTorch` class allows us to run our training function as a training job on SageMaker infrastructure. We need to configure it with our training script, an IAM role, the number of training instances, the training instance type, and hyperparameters. In this case we are going to run our training job on 2 ```ml.p3.2xlarge``` instances. But this example can be ran on one or multiple, cpu or gpu instances ([full list of available instances](https://aws.amazon.com/sagemaker/pricing/instance-types/)). The hyperparameters parameter is a dict of values that will be passed to your training script -- you can see how to access these values in the `mnist.py` script above."
+    "The `PyTorch` class allows us to run our training function as a training job on SageMaker infrastructure. We need to configure it with our training script, an IAM role, the number of training instances, the training instance type, and hyperparameters. In this case we are going to run our training job on 2 ```ml.c4.xlarge``` instances. But this example can be ran on one or multiple, cpu or gpu instances ([full list of available instances](https://aws.amazon.com/sagemaker/pricing/instance-types/)). The hyperparameters parameter is a dict of values that will be passed to your training script -- you can see how to access these values in the `mnist.py` script above."
     ]
    },
    {
@@ -158,7 +158,7 @@
     " role=role,\n",
     " framework_version='0.4.0',\n",
     " train_instance_count=2,\n",
-    " train_instance_type='ml.p3.2xlarge',\n",
+    " train_instance_type='ml.c4.xlarge',\n",
     " hyperparameters={\n",
     " 'epochs': 6,\n",
     " 'backend': 'gloo'\n",
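The `pytorch_mnist` change swaps two GPU instances for two CPU instances. A minimal sketch of the resulting distributed-training settings (again as a plain dict, so no AWS session is needed; only keys visible in the diff are included):

```python
# Hedged sketch of the updated pytorch_mnist configuration from the hunks above.
# Two ml.c4.xlarge CPU instances with the 'gloo' backend: gloo runs on CPU,
# whereas NCCL-based distribution requires GPUs, so the backend choice is
# consistent with the move off p3 instances.
mnist_kwargs = {
    "framework_version": "0.4.0",
    "train_instance_count": 2,
    "train_instance_type": "ml.c4.xlarge",  # was ml.p3.2xlarge before this PR
    "hyperparameters": {"epochs": 6, "backend": "gloo"},
}
```

The `hyperparameters` dict is forwarded to the training script, which is how `mnist.py` receives the `backend` value it passes to `torch.distributed`.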