
Commit 1d3a97e: "small pr changes"
1 parent 2aff732 commit 1d3a97e

File tree

2 files changed: +10, -10 lines


sagemaker-python-sdk/tensorflow_distributed_mnist/tensorflow_distributed_mnist.ipynb

Lines changed: 10 additions & 10 deletions
@@ -116,11 +116,11 @@
     "- [creates an optimizer and minimizes the loss function to improve the neural network](https://github.com/tensorflow/models/blob/master/official/mnist/mnist.py#L193)\n",
     "- [returns the output, optimizer and loss function](https://github.com/tensorflow/models/blob/master/official/mnist/mnist.py#L205)\n",
     "\n",
-    "## Writing writint a ```model_fn``` for distributed training\n",
+    "## Writing a ```model_fn``` for distributed training\n",
     "When distributed training happens, the same neural network will be sent to the multiple training instances. Each instance will predict a batch of the dataset, calculate loss and minimize the optimizer. One entire loop of this process is called **training step**.\n",
     "\n",
     "### Syncronizing training steps\n",
-    "A [global step](https://www.tensorflow.org/api_docs/python/tf/train/global_step) it is a global variable shared between the instances. It necessary for distributed training, so the optimizer will keep track of the number of **training steps** between runs: \n",
+    "A [global step](https://www.tensorflow.org/api_docs/python/tf/train/global_step) is a global variable shared between the instances. It necessary for distributed training, so the optimizer will keep track of the number of **training steps** between runs: \n",
     "\n",
     "```python\n",
     "train_op = optimizer.minimize(loss, tf.train.get_or_create_global_step())\n",
@@ -241,25 +241,25 @@
   }
  ],
  "metadata": {
+  "notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.",
   "kernelspec": {
-   "display_name": "Python 2",
+   "display_name": "Environment (conda_tensorflow_p27)",
    "language": "python",
-   "name": "python2"
+   "name": "conda_tensorflow_p27"
   },
   "language_info": {
    "codemirror_mode": {
     "name": "ipython",
-    "version": 2
+    "version": 3
    },
    "file_extension": ".py",
    "mimetype": "text/x-python",
    "name": "python",
    "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython2",
-   "version": "2.7.10"
-  },
-  "notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License."
+   "pygments_lexer": "ipython3",
+   "version": "2.7.13"
+  }
  },
  "nbformat": 4,
  "nbformat_minor": 2
-}
+}
Binary file not shown.
