
Commit 5a91d18

eslesar-aws and Eliza Zhang authored and committed
Edited the tf script mode notebook (aws#90)
* edited tf script mode notebook
1 parent de08ef9 commit 5a91d18

File tree: 1 file changed (+93 additions, -4 deletions)

examples/script_mode_train_any_tf_script_in_sage_maker.ipynb

Lines changed: 93 additions & 4 deletions
@@ -4,6 +4,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "# Use Script Mode to train any TensorFlow script from GitHub in SageMaker\n",
 "\n",
@@ -12,28 +13,41 @@
 "For this example, you use [Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow](https://github.com/sherjilozair/char-rnn-tensorflow), but you can use the same technique for other scripts or repositories. For example, [TensorFlow Model Zoo](https://github.com/tensorflow/models) and [TensorFlow benchmark scripts](https://github.com/tensorflow/benchmarks/tree/master/scripts/tf_cnn_benchmarks)."
 =======
 "# Using the Script Mode to train any TensorFlow script from GitHub in SageMaker\n",
+=======
+"# Use Script Mode to train any TensorFlow script from GitHub in SageMaker\n",
+>>>>>>> Edited the tf script mode notebook (#90)
 "\n",
-"In this tutorial, we show how simple it is to train a TensorFlow script in SageMaker using the new Script Mode Tensorflow Container.\n",
+"In this tutorial, you train a TensorFlow script in SageMaker using the new Script Mode Tensorflow Container.\n",
 "\n",
+<<<<<<< HEAD
 "The example we chose is [Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow](https://github.com/sherjilozair/char-rnn-tensorflow) but this same technique can be used to other scripts or repositories including [TensorFlow Model Zoo](https://github.com/tensorflow/models) and [TensorFlow benchmark scripts](https://github.com/tensorflow/benchmarks/tree/master/scripts/tf_cnn_benchmarks)."
 >>>>>>> Add Script Mode example (#83)
+=======
+"For this example, you use [Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow](https://github.com/sherjilozair/char-rnn-tensorflow), but you can use the same technique for other scripts or repositories. For example, [TensorFlow Model Zoo](https://github.com/tensorflow/models) and [TensorFlow benchmark scripts](https://github.com/tensorflow/benchmarks/tree/master/scripts/tf_cnn_benchmarks)."
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 <<<<<<< HEAD
+<<<<<<< HEAD
+=======
+>>>>>>> Edited the tf script mode notebook (#90)
 "## Set up the environment\n",
 "Let's start by creating a SageMaker session and specifying the following:\n",
 "- The S3 bucket and prefix to use for training and model data. The bucket should be in the same region as the Notebook Instance, training instance(s), and hosting instance(s). This example uses the default bucket that a SageMaker `Session` creates.\n",
 "- The IAM role that allows SageMaker services to access your data. For more information about using IAM roles in SageMaker, see [Amazon SageMaker Roles](https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-roles.html).\n"
+<<<<<<< HEAD
 =======
 "## Setting up the environment\n",
 "Let's start by creating a SageMaker session and specifying:\n",
 "- The S3 bucket and prefix that you want to use for training and model data. It should be within the same region as the Notebook Instance, training, and hosting.\n",
 "- The IAM role allows SageMaker services to access your data. See the documentation [for how to create these](https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-roles.html).\n"
 >>>>>>> Add Script Mode example (#83)
+=======
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
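For quick reference alongside this diff, a minimal sketch of the setup the cell above describes, using the standard SageMaker Python SDK calls; the variable names (`sagemaker_session`, `bucket`, `prefix`, `role`) and the prefix value are illustrative, not copied from the notebook:

```python
import sagemaker
from sagemaker import get_execution_role

# Create a SageMaker session; region and credentials come from the notebook environment.
sagemaker_session = sagemaker.Session()

# Default bucket created by the Session, plus an illustrative prefix for data and model artifacts.
bucket = sagemaker_session.default_bucket()
prefix = 'script-mode/char-rnn'  # hypothetical prefix

# IAM role that lets SageMaker access the training data and write model artifacts.
role = get_execution_role()
```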
@@ -55,12 +69,17 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "### Clone the repository\n",
 "Run the following command to clone the repository that contains the example:"
 =======
 "### Clone the repository"
 >>>>>>> Add Script Mode example (#83)
+=======
+"### Clone the repository\n",
+"Run the following command to clone the repository that contains the example:"
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
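The clone cell itself falls outside this hunk; a sketch of what such a cell typically looks like for the repository named above (Jupyter shell escape; the URL is assumed to be the upstream GitHub location):

```python
# Clone the char-rnn-tensorflow example into the notebook instance's working directory.
!git clone https://github.com/sherjilozair/char-rnn-tensorflow
```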
@@ -93,12 +112,17 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "### Get the data\n",
 "For training data, use plain text versions of Sherlock Holmes stories."
 =======
 "### Getting the data"
 >>>>>>> Add Script Mode example (#83)
+=======
+"### Get the data\n",
+"For training data, use plain text versions of Sherlock Holmes stories."
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
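The download cell is likewise outside this hunk. A hedged sketch of the general shape of that step; the URL below is a placeholder, since the notebook's actual source for the Sherlock Holmes text is not shown in this diff:

```python
import os
import urllib.request

data_dir = 'sherlock'
os.makedirs(data_dir, exist_ok=True)

# Placeholder URL: replace with the plain-text Sherlock Holmes source the notebook actually uses.
url = 'https://example.com/sherlock-holmes.txt'

# char-rnn-tensorflow reads a single input.txt from the directory passed as --data_dir.
urllib.request.urlretrieve(url, os.path.join(data_dir, 'input.txt'))
```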
@@ -115,11 +139,15 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "## Test locally"
 =======
 "## Testing locally"
 >>>>>>> Add Script Mode example (#83)
+=======
+"## Test locally"
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
@@ -167,11 +195,15 @@
 "metadata": {},
 "source": [
 "\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "Use [Local Mode](https://github.com/aws/sagemaker-python-sdk#local-mode) to run the script locally in the notebook instance before you run a SageMaker training job:"
 =======
 "We can use [Local Mode](https://github.com/aws/sagemaker-python-sdk#local-mode) to simulate SageMaker locally before submit training:"
 >>>>>>> Add Script Mode example (#83)
+=======
+"Use [Local Mode](https://github.com/aws/sagemaker-python-sdk#local-mode) to run the script locally in the notebook instance before you run a SageMaker training job:"
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
@@ -188,11 +220,15 @@
 "\n",
 "estimator = ScriptModeTensorFlow(entry_point='train.py',\n",
 " source_dir='char-rnn-tensorflow',\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 " train_instance_type='local', # Run in local mode\n",
 =======
 " train_instance_type='local', \n",
 >>>>>>> Add Script Mode example (#83)
+=======
+" train_instance_type='local', # Run in local mode\n",
+>>>>>>> Edited the tf script mode notebook (#90)
 " train_instance_count=1,\n",
 " hyperparameters=hyperparameters,\n",
 " role=role)\n",
@@ -206,6 +242,7 @@
 "source": [
 "## How Script Mode executes the script in the container\n",
 "\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "The above cell downloads a Python 3 CPU container locally and simulates a SageMaker training job. When training starts, script mode installs the user script as a Python module. The module name matches the script name. In this case, **train.py** is transformed into a Python module named **train**.\n",
 "\n",
@@ -215,15 +252,24 @@
 "\n",
 "After that, the Python interpreter executes the user module, passing **hyperparameters** as script arguments. The example above will be executed as follow:\n",
 >>>>>>> Add Script Mode example (#83)
+=======
+"The above cell downloads a Python 3 CPU container locally and simulates a SageMaker training job. When training starts, script mode installs the user script as a Python module. The module name matches the script name. In this case, **train.py** is transformed into a Python module named **train**.\n",
+"\n",
+"After that, the Python interpreter executes the user module, passing **hyperparameters** as script arguments. The example above is executed as follows:\n",
+>>>>>>> Edited the tf script mode notebook (#90)
 "```bash\n",
 "python -m train --num-epochs 1 --data-dir /opt/ml/input/data/training --save-dir /opt/ml/model\n",
 "```\n",
 "\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "The **train** module consumes the hyperparameters using any argument parsing library. [The example we're using](https://github.com/sherjilozair/char-rnn-tensorflow/blob/master/train.py#L11) uses the Python [argparse](https://docs.python.org/3/library/argparse.html) library:\n",
 =======
 "A user provide script consumes the hyperparameters using any argument parsing library, [in the example above](https://github.com/sherjilozair/char-rnn-tensorflow/blob/master/train.py#L11):\n",
 >>>>>>> Add Script Mode example (#83)
+=======
+"The **train** module consumes the hyperparameters using any argument parsing library. [The example we're using](https://github.com/sherjilozair/char-rnn-tensorflow/blob/master/train.py#L11) uses the Python [argparse](https://docs.python.org/3/library/argparse.html) library:\n",
+>>>>>>> Edited the tf script mode notebook (#90)
 "\n",
 "```python\n",
 "parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)\n",
@@ -238,6 +284,7 @@
 "\n",
 "Let's explain the values of **--data_dir** and **--save-dir**:\n",
 "\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "- **/opt/ml/input/data/training** is the directory inside the container where the training data is downloaded. The data is downloaded to this folder because **training** is the channel name defined in ```estimator.fit({'training': inputs})```. See [training data](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html#your-algorithms-training-algo-running-container-trainingdata) for more information. \n",
 "\n",
@@ -251,16 +298,23 @@
 "For example, the example above can read information about the **training** channel provided in the training job request by adding the environment variable `SM_CHANNEL_TRAINING` as the default value for the `--data_dir` argument:\n",
 =======
 "- **/opt/ml/input/data/training** is the directory inside the container where the training data is downloaded. The data was downloaded in this folder because **training** is the channel name defined in ```estimator.fit({'training': inputs})```. See [training data](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html#your-algorithms-training-algo-running-container-trainingdata) for more information. \n",
+=======
+"- **/opt/ml/input/data/training** is the directory inside the container where the training data is downloaded. The data is downloaded to this folder because **training** is the channel name defined in ```estimator.fit({'training': inputs})```. See [training data](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html#your-algorithms-training-algo-running-container-trainingdata) for more information. \n",
+>>>>>>> Edited the tf script mode notebook (#90)
 "\n",
-"- **/opt/ml/model** use this directory to save models, checkpoints or any other data. Any data saved in this folder is saved in the S3 bucket defined for training. See [model data](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html#your-algorithms-training-algo-envvariables) for more information.\n",
+"- **/opt/ml/model** use this directory to save models, checkpoints, or any other data. Any data saved in this folder is saved in the S3 bucket defined for training. See [model data](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html#your-algorithms-training-algo-envvariables) for more information.\n",
 "\n",
 "### Reading additional information from the container\n",
 "\n",
-"Very often, a user script needs additional information from the container that is not available in ```hyperparameters```.\n",
-"SageMaker Containers writes this information as **environment variables** that are available inside the script.\n",
+"Often, a user script needs additional information from the container that is not available in ```hyperparameters```.\n",
+"SageMaker containers write this information as **environment variables** that are available inside the script.\n",
 "\n",
+<<<<<<< HEAD
 "For example, the example above can read information about the **training** channel provided in the training job request:\n",
 >>>>>>> Add Script Mode example (#83)
+=======
+"For example, the example above can read information about the **training** channel provided in the training job request by adding the environment variable `SM_CHANNEL_TRAINING` as the default value for the `--data_dir` argument:\n",
+>>>>>>> Edited the tf script mode notebook (#90)
 "\n",
 "```python\n",
 "if __name__ == '__main__':\n",
@@ -269,11 +323,15 @@
 " parser.add_argument('--data_dir', type=str, default=os.environ['SM_CHANNEL_TRAINING'])\n",
 "```\n",
 "\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "Script mode displays the list of available environment variables in the training logs. You can find the [entire list here](https://github.com/aws/sagemaker-containers/blob/master/README.md#environment-variables-full-specification)."
 =======
 "Script Mode displays the list of the environment variables available in the training logs. You can find the [entire list here](https://github.com/aws/sagemaker-containers/blob/master/README.md#environment-variables-full-specification)."
 >>>>>>> Add Script Mode example (#83)
+=======
+"Script mode displays the list of available environment variables in the training logs. You can find the [entire list here](https://github.com/aws/sagemaker-containers/blob/master/README.md#environment-variables-full-specification)."
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
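Putting the pieces from this cell together, a minimal sketch of a script-mode entry point that consumes a hyperparameter and the training channel; the `--num-epochs` and `--data_dir` names follow the char-rnn example above, while the `--save-dir` default via `SM_MODEL_DIR` is an assumption:

```python
import argparse
import os

if __name__ == '__main__':
    parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    # Hyperparameters passed to the estimator arrive as command-line arguments.
    parser.add_argument('--num-epochs', type=int, default=1)
    # SageMaker exposes the location of the 'training' channel through SM_CHANNEL_TRAINING.
    parser.add_argument('--data_dir', type=str, default=os.environ['SM_CHANNEL_TRAINING'])
    # Anything written under /opt/ml/model (SM_MODEL_DIR) is uploaded to S3 after training.
    parser.add_argument('--save-dir', type=str, default=os.environ.get('SM_MODEL_DIR', '/opt/ml/model'))
    args = parser.parse_args()
    print(args)
```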
@@ -287,11 +345,15 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "After you test the training job locally, upload the dataset to an S3 bucket so SageMaker can access the data during training.\n"
 =======
 "We need to upload the dataset to an S3 bucket so SageMaker can access the data during training.\n"
 >>>>>>> Add Script Mode example (#83)
+=======
+"After you test the training job locally, upload the dataset to an S3 bucket so SageMaker can access the data during training.\n"
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
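A minimal sketch of that upload, reusing the `sagemaker_session`, `bucket`, and `prefix` names assumed in the setup sketch earlier and a local `sherlock` data folder:

```python
# Upload the local dataset; the returned S3 URI becomes the 'training' channel input.
inputs = sagemaker_session.upload_data(path='sherlock', bucket=bucket, key_prefix=prefix + '/data')
print(inputs)
```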
@@ -307,11 +369,15 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "To train in SageMaker, change the estimator argument **train_instance_type** to any SageMaker ml instance available for training. For example:"
 =======
 "You can change the estimator argument **train_instance_type** to any SageMaker ml instance available for training. For example:"
 >>>>>>> Add Script Mode example (#83)
+=======
+"To train in SageMaker, change the estimator argument **train_instance_type** to any SageMaker ml instance available for training. For example:"
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
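A hedged sketch of the change this cell describes, reusing the estimator definition shown earlier in the diff; `ml.c4.xlarge` is only an illustrative instance type, not necessarily the one the notebook picks, and `inputs` is assumed to be the S3 URI returned by the upload step:

```python
estimator = ScriptModeTensorFlow(entry_point='train.py',
                                 source_dir='char-rnn-tensorflow',
                                 train_instance_type='ml.c4.xlarge',  # any SageMaker ML instance type
                                 train_instance_count=1,
                                 hyperparameters=hyperparameters,
                                 role=role)

# Train against the dataset uploaded to S3; 'training' is the channel name used throughout.
estimator.fit({'training': inputs})
```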
@@ -348,11 +414,15 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+<<<<<<< HEAD
 <<<<<<< HEAD
 "Script Mode installs the contents of your `source_dir` folder in the container as a [Python package](https://github.com/aws/sagemaker-containers/blob/master/src/sagemaker_containers/_modules.py#L100). You can include a [requirements.txt file in the root folder of your source_dir to install any pip dependencies](https://github.com/aws/sagemaker-containers/blob/master/src/sagemaker_containers/_modules.py#L111). You can, for example, install the lastest version of TensorFlow in the container:\n",
 =======
 "Script Mode will install your source_dir in the container as a [Python package](https://github.com/aws/sagemaker-containers/blob/master/src/sagemaker_containers/_modules.py#L100). You can include a [requirements.txt file in the root folder of your source_dir to install any pip dependencies](https://github.com/aws/sagemaker-containers/blob/master/src/sagemaker_containers/_modules.py#L111). You can, for example, install the lastest version of tensorflow in the container:\n",
 >>>>>>> Add Script Mode example (#83)
+=======
+"Script Mode installs the contents of your `source_dir` folder in the container as a [Python package](https://github.com/aws/sagemaker-containers/blob/master/src/sagemaker_containers/_modules.py#L100). You can include a [requirements.txt file in the root folder of your source_dir to install any pip dependencies](https://github.com/aws/sagemaker-containers/blob/master/src/sagemaker_containers/_modules.py#L111). You can, for example, install the lastest version of TensorFlow in the container:\n",
+>>>>>>> Edited the tf script mode notebook (#90)
 "\n",
 "content of requirements.txt\n",
 "```\n",
@@ -365,11 +435,15 @@
 "metadata": {},
 "source": [
 "# Installing apt-get packages and other dependencies\n",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "You can define a `setup.py` file in your `source_dir` folder to install other dependencies. The example below installs [TensorFlow for C](https://www.tensorflow.org/install/lang_c) in the container."
 =======
 "You can define a setup.py file in your source_dir to install other dependencies. The example below will install [TensorFlow for C](https://www.tensorflow.org/install/lang_c) in the container."
 >>>>>>> Add Script Mode example (#83)
+=======
+"You can define a `setup.py` file in your `source_dir` folder to install other dependencies. The example below installs [TensorFlow for C](https://www.tensorflow.org/install/lang_c) in the container."
+>>>>>>> Edited the tf script mode notebook (#90)
 ]
 },
 {
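The notebook's real `setup.py` (which installs TensorFlow for C) is not included in this hunk. As a generic illustration only, a minimal `setup.py` that script mode would run when it installs the `source_dir` package might look like this; the package name and dependency list are hypothetical:

```python
from setuptools import setup, find_packages

# Hypothetical, minimal setup.py placed in the root of source_dir.
# The notebook's actual example also fetches and installs TensorFlow for C;
# that custom install logic is omitted here.
setup(name='char-rnn-example',
      version='0.1',
      packages=find_packages(),
      install_requires=['tensorflow'])
```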
@@ -473,6 +547,7 @@
 ],
 "metadata": {
 "kernelspec": {
+<<<<<<< HEAD
 <<<<<<< HEAD
 "display_name": "Python 3",
 "language": "python",
@@ -482,27 +557,41 @@
 "language": "python",
 "name": "python2"
 >>>>>>> Add Script Mode example (#83)
+=======
+"display_name": "Python 3",
+"language": "python",
+"name": "python3"
+>>>>>>> Edited the tf script mode notebook (#90)
 },
 "language_info": {
 "codemirror_mode": {
 "name": "ipython",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "version": 3
 =======
 "version": 2
 >>>>>>> Add Script Mode example (#83)
+=======
+"version": 3
+>>>>>>> Edited the tf script mode notebook (#90)
 },
 "file_extension": ".py",
 "mimetype": "text/x-python",
 "name": "python",
 "nbconvert_exporter": "python",
+<<<<<<< HEAD
 <<<<<<< HEAD
 "pygments_lexer": "ipython3",
 "version": "3.6.5"
 =======
 "pygments_lexer": "ipython2",
 "version": "2.7.15"
 >>>>>>> Add Script Mode example (#83)
+=======
+"pygments_lexer": "ipython3",
+"version": "3.6.5"
+>>>>>>> Edited the tf script mode notebook (#90)
 }
 },
 "nbformat": 4,
