
Commit df7154c

Author: Raghav Dhall (committed)
documentation: minor style changes to pass codebuild
1 parent b631cd5, commit df7154c

32 files changed (+139, -143 lines)

doc/algorithms/other/index.rst

Lines changed: 1 addition & 1 deletion
@@ -7,4 +7,4 @@ Other
 .. toctree::
     :maxdepth: 2

-    sagemaker.amazon.amazon_estimator
+    sagemaker.amazon.amazon_estimator

doc/algorithms/tabular/autogluon.rst

Lines changed: 4 additions & 4 deletions
@@ -2,7 +2,7 @@
 AutoGluon
 ############

-`AutoGluon-Tabular <https://auto.gluon.ai/stable/index.html>`__ is a popular open-source AutoML framework that trains highly accurate machine learning models on an unprocessed tabular dataset.
+`AutoGluon-Tabular <https://auto.gluon.ai/stable/index.html>`__ is a popular open-source AutoML framework that trains highly accurate machine learning models on an unprocessed tabular dataset.
 Unlike existing AutoML frameworks that primarily focus on model and hyperparameter selection, AutoGluon-Tabular succeeds by ensembling multiple models and stacking them in multiple layers.


@@ -20,9 +20,9 @@ The following table outlines a variety of sample notebooks that address different
   - This notebook demonstrates the use of the Amazon SageMaker AutoGluon-Tabular algorithm to train and host a tabular regression model.


-For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
-`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
-instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
+For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
+`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
+instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
 Use tab and choose Create copy.

 For detailed documentation, please refer to the `Sagemaker AutoGluon-Tabular Algorithm <https://docs.aws.amazon.com/sagemaker/latest/dg/autogluon-tabular.html>`__.

doc/algorithms/tabular/catboost.rst

Lines changed: 7 additions & 7 deletions
@@ -3,8 +3,8 @@ CatBoost
 ############


-`CatBoost <https://catboost.ai/>`__ is a popular and high-performance open-source implementation of the Gradient Boosting Decision Tree (GBDT)
-algorithm. GBDT is a supervised learning algorithm that attempts to accurately predict a target variable by combining an ensemble of
+`CatBoost <https://catboost.ai/>`__ is a popular and high-performance open-source implementation of the Gradient Boosting Decision Tree (GBDT)
+algorithm. GBDT is a supervised learning algorithm that attempts to accurately predict a target variable by combining an ensemble of
 estimates from a set of simpler and weaker models.

 CatBoost introduces two critical algorithmic advances to GBDT:
@@ -13,7 +13,7 @@ CatBoost introduces two critical algorithmic advances to GBDT:

 * An innovative algorithm for processing categorical features

-Both techniques were created to fight a prediction shift caused by a special kind of target leakage present in all currently existing
+Both techniques were created to fight a prediction shift caused by a special kind of target leakage present in all currently existing
 implementations of gradient boosting algorithms.

 The following table outlines a variety of sample notebooks that address different use cases of Amazon SageMaker CatBoost algorithm.
@@ -27,11 +27,11 @@ The following table outlines a variety of sample notebooks that address different
 * - `Tabular classification with Amazon SageMaker LightGBM and CatBoost algorithm <https://github.com/aws/amazon-sagemaker-examples/blob/main/introduction_to_amazon_algorithms/lightgbm_catboost_tabular/Amazon_Tabular_Classification_LightGBM_CatBoost.ipynb>`__
   - This notebook demonstrates the use of the Amazon SageMaker CatBoost algorithm to train and host a tabular classification model.
 * - `Tabular regression with Amazon SageMaker LightGBM and CatBoost algorithm <https://github.com/aws/amazon-sagemaker-examples/blob/main/introduction_to_amazon_algorithms/lightgbm_catboost_tabular/Amazon_Tabular_Regression_LightGBM_CatBoost.ipynb>`__
-  - This notebook demonstrates the use of the Amazon SageMaker CatBoost algorithm to train and host a tabular regression model.
+  - This notebook demonstrates the use of the Amazon SageMaker CatBoost algorithm to train and host a tabular regression model.

-For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
-`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
-instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
+For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
+`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
+instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
 Use tab and choose Create copy.

 For detailed documentation, please refer to the `Sagemaker CatBoost Algorithm <https://docs.aws.amazon.com/sagemaker/latest/dg/catboost.html>`__.
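As a companion to the catboost.rst page above, here is a small, hedged sketch of fetching the built-in algorithm's default hyperparameters and overriding a couple of them before training. The model_id, the hyperparameter key names, and the chosen values are placeholders, not part of this commit::

    from sagemaker import hyperparameters

    # Placeholder JumpStart identifier for the CatBoost classification algorithm.
    model_id, model_version = "catboost-classification-model", "*"

    # Pull the algorithm's default hyperparameters, then override a couple of them.
    hps = hyperparameters.retrieve_default(model_id=model_id, model_version=model_version)
    hps["iterations"] = "500"       # number of boosting rounds (assumed key name)
    hps["learning_rate"] = "0.1"    # assumed key name

    # The resulting dict would then be passed to the training Estimator, e.g.
    # estimator.set_hyperparameters(**hps), as in the AutoGluon sketch above.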

doc/algorithms/tabular/lightgbm.rst

Lines changed: 9 additions & 9 deletions
@@ -2,10 +2,10 @@
 LightGBM
 ############

-`LightGBM <https://lightgbm.readthedocs.io/en/latest/>`__ is a popular and efficient open-source implementation of the Gradient Boosting
-Decision Tree (GBDT) algorithm. GBDT is a supervised learning algorithm that attempts to accurately predict a target variable by
-combining an ensemble of estimates from a set of simpler and weaker models. LightGBM uses additional techniques to significantly improve
-the efficiency and scalability of conventional GBDT.
+`LightGBM <https://lightgbm.readthedocs.io/en/latest/>`__ is a popular and efficient open-source implementation of the Gradient Boosting
+Decision Tree (GBDT) algorithm. GBDT is a supervised learning algorithm that attempts to accurately predict a target variable by
+combining an ensemble of estimates from a set of simpler and weaker models. LightGBM uses additional techniques to significantly improve
+the efficiency and scalability of conventional GBDT.

 The following table outlines a variety of sample notebooks that address different use cases of Amazon SageMaker LightGBM algorithm.

@@ -16,13 +16,13 @@ The following table outlines a variety of sample notebooks that address different
 * - Notebook Title
   - Description
 * - `Tabular classification with Amazon SageMaker LightGBM and CatBoost algorithm <https://github.com/aws/amazon-sagemaker-examples/blob/main/introduction_to_amazon_algorithms/lightgbm_catboost_tabular/Amazon_Tabular_Classification_LightGBM_CatBoost.ipynb>`__
-  - This notebook demonstrates the use of the Amazon SageMaker LightGBM algorithm to train and host a tabular classification model.
+  - This notebook demonstrates the use of the Amazon SageMaker LightGBM algorithm to train and host a tabular classification model.
 * - `Tabular regression with Amazon SageMaker LightGBM and CatBoost algorithm <https://github.com/aws/amazon-sagemaker-examples/blob/main/introduction_to_amazon_algorithms/lightgbm_catboost_tabular/Amazon_Tabular_Regression_LightGBM_CatBoost.ipynb>`__
-  - This notebook demonstrates the use of the Amazon SageMaker LightGBM algorithm to train and host a tabular regression model.
+  - This notebook demonstrates the use of the Amazon SageMaker LightGBM algorithm to train and host a tabular regression model.

-For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
-`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
-instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
+For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
+`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
+instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
 Use tab and choose Create copy.

 For detailed documentation, please refer to the `Sagemaker LightGBM Algorithm <https://docs.aws.amazon.com/sagemaker/latest/dg/lightgbm.html>`__.
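The lightgbm.rst page, like the others, describes training and hosting a model; the following is a minimal, hedged sketch of deploying a trained estimator to a real-time endpoint and sending it one CSV record. It assumes an ``estimator`` fit as in the AutoGluon sketch above, and the instance type and payload values are placeholders::

    # Deploy the trained model behind a real-time endpoint.
    predictor = estimator.deploy(
        initial_instance_count=1,
        instance_type="ml.m5.xlarge",
    )

    # One feature row, CSV-encoded; column order must match the training data (placeholder values).
    payload = "5.1,3.5,1.4,0.2"
    response = predictor.predict(
        payload,
        initial_args={"ContentType": "text/csv", "Accept": "application/json"},
    )
    print(response)

    predictor.delete_endpoint()  # clean up the endpoint when finished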

doc/algorithms/tabular/tabtransformer.rst

Lines changed: 8 additions & 8 deletions
@@ -2,8 +2,8 @@
 TabTransformer
 ###############

-`TabTransformer <https://arxiv.org/abs/2012.06678>`__ is a novel deep tabular data modeling architecture for supervised learning. The TabTransformer architecture is built on self-attention-based Transformers.
-The Transformer layers transform the embeddings of categorical features into robust contextual embeddings to achieve higher prediction accuracy. Furthermore, the contextual embeddings learned from TabTransformer
+`TabTransformer <https://arxiv.org/abs/2012.06678>`__ is a novel deep tabular data modeling architecture for supervised learning. The TabTransformer architecture is built on self-attention-based Transformers.
+The Transformer layers transform the embeddings of categorical features into robust contextual embeddings to achieve higher prediction accuracy. Furthermore, the contextual embeddings learned from TabTransformer
 are highly robust against both missing and noisy data features, and provide better interpretability.


@@ -16,13 +16,13 @@ The following table outlines a variety of sample notebooks that address different
 * - Notebook Title
   - Description
 * - `Tabular classification with Amazon SageMaker TabTransformer algorithm <https://github.com/aws/amazon-sagemaker-examples/blob/main/introduction_to_amazon_algorithms/tabtransformer_tabular/Amazon_Tabular_Classification_TabTransformer.ipynb>`__
-  - This notebook demonstrates the use of the Amazon SageMaker TabTransformer algorithm to train and host a tabular classification model.
+  - This notebook demonstrates the use of the Amazon SageMaker TabTransformer algorithm to train and host a tabular classification model.
 * - `Tabular regression with Amazon SageMaker TabTransformer algorithm <https://github.com/aws/amazon-sagemaker-examples/blob/main/introduction_to_amazon_algorithms/tabtransformer_tabular/Amazon_Tabular_Regression_TabTransformer.ipynb>`__
-  - This notebook demonstrates the use of the Amazon SageMaker TabTransformer algorithm to train and host a tabular regression model.
+  - This notebook demonstrates the use of the Amazon SageMaker TabTransformer algorithm to train and host a tabular regression model.

-For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
-`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
-instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
+For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
+`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
+instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
 Use tab and choose Create copy.

-For detailed documentation, please refer to the `Sagemaker TabTransformer Algorithm <https://docs.aws.amazon.com/sagemaker/latest/dg/tabtransformer.html>`__.
+For detailed documentation, please refer to the `Sagemaker TabTransformer Algorithm <https://docs.aws.amazon.com/sagemaker/latest/dg/tabtransformer.html>`__.
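To complement the tabtransformer.rst page above, here is a short, hedged sketch of wiring the training and validation channels that would be passed to a tabular training job. The S3 prefixes are placeholders, and the CSV layout noted in the comments (a data.csv file with the target in the first column) is an assumption based on the convention used by the SageMaker built-in tabular algorithms, not something stated in this commit::

    from sagemaker.inputs import TrainingInput

    # Placeholder S3 prefixes; each is assumed to hold a data.csv with the target in the first column.
    train_input = TrainingInput(
        "s3://my-bucket/tabtransformer/train/",
        content_type="text/csv",
    )
    validation_input = TrainingInput(
        "s3://my-bucket/tabtransformer/validation/",
        content_type="text/csv",
    )

    # These channels would then be passed to fit(), e.g.
    # estimator.fit({"training": train_input, "validation": validation_input})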

doc/algorithms/tabular/xgboost.rst

Lines changed: 7 additions & 7 deletions
@@ -2,11 +2,11 @@
 XGBoost
 ############

-The `XGBoost <https://github.com/dmlc/xgboost>`__ (eXtreme Gradient Boosting) is a popular and efficient open-source implementation of the gradient boosted trees algorithm. Gradient boosting is a supervised learning algorithm that attempts to accurately predict a target variable
-by combining an ensemble of estimates from a set of simpler and weaker models. The XGBoost algorithm performs well in machine learning competitions because of its robust handling of a variety of data types, relationships, distributions, and the variety of hyperparameters that you can
+The `XGBoost <https://github.com/dmlc/xgboost>`__ (eXtreme Gradient Boosting) is a popular and efficient open-source implementation of the gradient boosted trees algorithm. Gradient boosting is a supervised learning algorithm that attempts to accurately predict a target variable
+by combining an ensemble of estimates from a set of simpler and weaker models. The XGBoost algorithm performs well in machine learning competitions because of its robust handling of a variety of data types, relationships, distributions, and the variety of hyperparameters that you can
 fine-tune. You can use XGBoost for regression, classification (binary and multiclass), and ranking problems.

-You can use the new release of the XGBoost algorithm either as a Amazon SageMaker built-in algorithm or as a framework to run training scripts in your local environments. This implementation has a smaller memory footprint, better logging, improved hyperparameter validation, and
+You can use the new release of the XGBoost algorithm either as a Amazon SageMaker built-in algorithm or as a framework to run training scripts in your local environments. This implementation has a smaller memory footprint, better logging, improved hyperparameter validation, and
 an expanded set of metrics than the original versions. It provides an XGBoost estimator that executes a training script in a managed XGBoost environment. The current release of SageMaker XGBoost is based on the original XGBoost versions 1.0, 1.2, 1.3, and 1.5.

 The following table outlines a variety of sample notebooks that address different use cases of Amazon SageMaker XGBoost algorithm.
@@ -18,7 +18,7 @@ The following table outlines a variety of sample notebooks that address different
 * - Notebook Title
   - Description
 * - `How to Create a Custom XGBoost container? <https://sagemaker-examples.readthedocs.io/en/latest/aws_sagemaker_studio/sagemaker_studio_image_build/xgboost_bring_your_own/Batch_Transform_BYO_XGB.html>`__
-  - This notebook shows you how to build a custom XGBoost Container with Amazon SageMaker Batch Transform.
+  - This notebook shows you how to build a custom XGBoost Container with Amazon SageMaker Batch Transform.
 * - `Regression with XGBoost using Parquet <https://sagemaker-examples.readthedocs.io/en/latest/introduction_to_amazon_algorithms/xgboost_abalone/xgboost_parquet_input_training.html>`__
   - This notebook shows you how to use the Abalone dataset in Parquet to train a XGBoost model.
 * - `How to Train and Host a Multiclass Classification Model? <https://sagemaker-examples.readthedocs.io/en/latest/introduction_to_amazon_algorithms/xgboost_mnist/xgboost_mnist.html>`__
@@ -32,9 +32,9 @@ The following table outlines a variety of sample notebooks that address different
 * - `How to use Amazon SageMaker Debugger to debug XGBoost Training Jobs in Real-Time? <https://sagemaker-examples.readthedocs.io/en/latest/sagemaker-debugger/xgboost_realtime_analysis/xgboost-realtime-analysis.html>`__
   - This notebook shows you how to use the MNIST dataset and Amazon SageMaker Debugger to perform real-time analysis of XGBoost training jobs while training jobs are running.

-For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
-`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
-instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
+For instructions on how to create and access Jupyter notebook instances that you can use to run the example in SageMaker, see
+`Use Amazon SageMaker Notebook Instances <https://docs.aws.amazon.com/sagemaker/latest/dg/nbi.html>`__. After you have created a notebook
+instance and opened it, choose the SageMaker Examples tab to see a list of all of the SageMaker samples. To open a notebook, choose its
 Use tab and choose Create copy.

 For detailed documentation, please refer to the `Sagemaker XGBoost Algorithm <https://docs.aws.amazon.com/sagemaker/latest/dg/xgboost.html>`__.
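The xgboost.rst page above notes that the algorithm can be used either as a built-in algorithm or as a framework for your own training script. The following hedged sketch shows both modes with the SageMaker Python SDK; the region, role, S3 paths, entry-point script, and hyperparameter values are placeholders, not part of this commit::

    import sagemaker
    from sagemaker import image_uris
    from sagemaker.estimator import Estimator
    from sagemaker.xgboost import XGBoost

    role = sagemaker.get_execution_role()  # assumes a SageMaker execution role is available

    # 1) Built-in algorithm mode: retrieve the managed XGBoost image and set hyperparameters directly.
    container = image_uris.retrieve("xgboost", region="us-west-2", version="1.5-1")
    builtin = Estimator(
        image_uri=container,
        role=role,
        instance_count=1,
        instance_type="ml.m5.xlarge",
    )
    builtin.set_hyperparameters(objective="reg:squarederror", num_round=100)
    builtin.fit({"train": "s3://my-bucket/xgboost/train/"})  # placeholder training channel

    # 2) Framework mode: run your own training script in the managed XGBoost environment.
    framework = XGBoost(
        entry_point="my_train.py",  # hypothetical local training script
        framework_version="1.5-1",
        role=role,
        instance_count=1,
        instance_type="ml.m5.xlarge",
        hyperparameters={"num_round": 100},
    )
    framework.fit({"train": "s3://my-bucket/xgboost/train/"})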
