Commit fd5d2f6

Clean up RTD Website #1 (#3312)
* rst edits
* edit introduction_to_amazon_algorithms/blazingtext_hosting_pretrained_fasttext/blazingtext_hosting_pretrained_fasttext.ipynb
* edit introduction_to_amazon_algorithms/xgboost_mnist/xgboost_mnist.ipynb
* edit aws_sagemaker_studio/sagemaker_studio_image_build/xgboost_bring_your_own/Batch_Transform_BYO_XGB.ipynb
* fix autopilot/custom-feature-selection/Feature_selection_autopilot.ipynb
* fix advanced_functionality/search/ml_experiment_management_using_search.ipynb
* edit ground_truth_labeling_jobs/from_unlabeled_data_to_deployed_machine_learning_model_ground_truth_demo_image_classification/from_unlabeled_data_to_deployed_machine_learning_model_ground_truth_demo_image_classification.ipynb
* edit ground_truth_labeling_jobs/3d_point_cloud_input_data_processing/3D-point-cloud-input-data-processing.ipynb
* edit ground_truth_labeling_jobs/ground_truth_object_detection_tutorial/object_detection_tutorial.ipynb
* edit autopilot/custom-feature-selection/Feature_selection_autopilot.ipynb
* edit ground_truth_labeling_jobs/pretrained_model/pretrained_model_labeling_tutorial.ipynb
* edit ground_truth_labeling_jobs/bring_your_own_model_for_sagemaker_labeling_workflows_with_active_learning/bring_your_own_model_for_sagemaker_labeling_workflows_with_active_learning.ipynb
* edit ground_truth_labeling_jobs/3d_point_cloud_input_data_processing/3D-point-cloud-input-data-processing.ipynb
* remove broken link for deleted notebook in training/algorithms.rst
* edit aws_marketplace/creating_marketplace_products/algorithms/Bring_Your_Own-Creating_Algorithm_and_Model_Package.ipynb
* edit aws_marketplace/curating_aws_marketplace_listing_and_sample_notebook/Algorithm/Sample_Notebook_Template/title_of_your_product-Algorithm.ipynb
* edit aws_marketplace/curating_aws_marketplace_listing_and_sample_notebook/ModelPackage/Sample_Notebook_Template/title_of_your_product-Model.ipynb
* edit aws_marketplace/using_data/image_classification_with_shutterstock_image_datasets/image-classification-with-shutterstock-datasets.ipynb
* edit aws_marketplace/using_model_packages/auto_insurance/automating_auto_insurance_claim_processing.ipynb
* edit aws_marketplace/using_model_packages/improving_industrial_workplace_safety/improving_industrial_workplace_safety.ipynb
* edit aws_marketplace/using_model_packages/generic_sample_notebook/A_generic_sample_notebook_to_perform_inference_on_ML_model_packages_from_AWS_Marketplace.ipynb
* edit aws_marketplace/using_model_packages/creative-writing-using-gpt-2-text-generation/creative-writing-using-gpt-2-text-generation.ipynb
* edit aws_marketplace/using_model_packages/amazon_augmented_ai_with_aws_marketplace_ml_models/amazon_augmented_ai_with_aws_marketplace_ml_models.ipynb
* edit aws_marketplace/using-open-source-model-packages/pytorch-ic-model/using-image-classification-models.ipynb
* edit advanced_functionality/pytorch_extending_our_containers/pytorch_extending_our_containers.ipynb
* edit advanced_functionality/scikit_bring_your_own/scikit_bring_your_own.ipynb
* edit advanced_functionality/tensorflow_bring_your_own/tensorflow_bring_your_own.ipynb
* edit sagemaker-python-sdk/tensorflow_serving_using_elastic_inference_with_your_own_model/tensorflow_serving_pretrained_model_elastic_inference.ipynb

Co-authored-by: EC2 Default User <[email protected]>

Ignoring CI as this is a cosmetic change only. We need to cosmetically change ~90 notebooks in 5 days to meet a deadline; with CI, we will not meet it.
1 parent 9233ad5 commit fd5d2f6

30 files changed: +214 -236 lines changed

README.md

Lines changed: 0 additions & 1 deletion
@@ -47,7 +47,6 @@ These examples provide a gentle introduction to machine learning concepts as the
 
 - [Targeted Direct Marketing](introduction_to_applying_machine_learning/xgboost_direct_marketing) predicts potential customers that are most likely to convert based on customer and aggregate level metrics, using Amazon SageMaker's implementation of [XGBoost](https://github.com/dmlc/xgboost).
 - [Predicting Customer Churn](introduction_to_applying_machine_learning/xgboost_customer_churn) uses customer interaction and service usage data to find those most likely to churn, and then walks through the cost/benefit trade-offs of providing retention incentives. This uses Amazon SageMaker's implementation of [XGBoost](https://github.com/dmlc/xgboost) to create a highly predictive model.
-- [Time-series Forecasting](introduction_to_applying_machine_learning/linear_time_series_forecast) generates a forecast for topline product demand using Amazon SageMaker's Linear Learner algorithm.
 - [Cancer Prediction](introduction_to_applying_machine_learning/breast_cancer_prediction) predicts Breast Cancer based on features derived from images, using SageMaker's Linear Learner.
 - [Ensembling](introduction_to_applying_machine_learning/ensemble_modeling) predicts income using two Amazon SageMaker models to show the advantages in ensembling.
 - [Video Game Sales](introduction_to_applying_machine_learning/video_game_sales) develops a binary prediction model for the success of video games based on review scores.

advanced_functionality/pytorch_extending_our_containers/pytorch_extending_our_containers.ipynb

Lines changed: 3 additions & 3 deletions
@@ -77,7 +77,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
+"## Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
 "\n",
 "### An overview of Docker\n",
 "\n",
@@ -517,7 +517,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 2: Training and Hosting your Algorithm in Amazon SageMaker\n",
+"## Part 2: Training and Hosting your Algorithm in Amazon SageMaker\n",
 "Once you have your container packaged, you can use it to train and serve models. Let's do that with the algorithm we made above.\n",
 "\n",
 "## Set up the environment\n",
@@ -684,7 +684,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Reference\n",
+"## Reference\n",
 "- [How Amazon SageMaker interacts with your Docker container for training](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html)\n",
 "- [How Amazon SageMaker interacts with your Docker container for inference](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-inference-code.html)\n",
 "- [CIFAR-10 Dataset](https://www.cs.toronto.edu/~kriz/cifar.html)\n",

advanced_functionality/scikit_bring_your_own/scikit_bring_your_own.ipynb

Lines changed: 2 additions & 2 deletions
@@ -77,7 +77,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
+"## Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
 "\n",
 "### An overview of Docker\n",
 "\n",
@@ -303,7 +303,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 2: Using your Algorithm in Amazon SageMaker\n",
+"## Part 2: Using your Algorithm in Amazon SageMaker\n",
 "\n",
 "Once you have your container packaged, you can use it to train models and use the model for hosting or batch transforms. Let's do that with the algorithm we made above.\n",
 "\n",

advanced_functionality/search/ml_experiment_management_using_search.ipynb

Lines changed: 5 additions & 5 deletions
@@ -35,7 +35,7 @@
 "tags": []
 },
 "source": [
-"# Introduction\n",
+"## Introduction\n",
 "\n",
 "Welcome to our example introducing Amazon SageMaker Search! Amazon SageMaker Search lets you quickly find and evaluate the most relevant model training runs from potentially hundreds and thousands of your Amazon SageMaker model training jobs.\n",
 "Developing a machine learning model requires continuous experimentation, trying new learning algorithms and tuning hyper parameters, all the while observing the impact of such changes on model performance and accuracy. This iterative exercise often leads to explosion of hundreds of model training experiments and model versions, slowing down the convergence and discovery of “winning” model. In addition, the information explosion makes it very hard down the line to trace back the lineage of a model version i.e. the unique combination of datasets, algorithms and parameters that brewed that model in the first place. \n",
@@ -292,7 +292,7 @@
 "tags": []
 },
 "source": [
-"# Training the linear model\n",
+"## Training the linear model\n",
 "Once we have the data preprocessed and available in the correct format for training, the next step is to actually train the model using the data. First, let's specify our algorithm container. More details on algorithm containers can be found in [AWS documentation](https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-algo-docker-registry-paths.html)."
 ]
 },
@@ -457,7 +457,7 @@
 "tags": []
 },
 "source": [
-"# Use Amazon SageMaker Search to organize and evaluate experiments\n",
+"## Use Amazon SageMaker Search to organize and evaluate experiments\n",
 "Usually you will experiment with tuning multiple hyperparameters or even try new learning algorithms and training datasets resulting in potentially hundreds of model training runs and model versions. However, for the sake of simplicity, we are only tuning mini_batch_size in this example, trying only three different values resulting in as many model versions. Now we will use [Search](https://docs.aws.amazon.com/sagemaker/latest/dg/search.html) to **group together** the three model training runs and **evaluate** the best performing model by ranking and comparing them on a metric of our choice. \n",
 "\n",
 "**For grouping** the relevant model training runs together, we will search the model training jobs by the unique label or tag that we have been using as a tracking label to track our experiments. \n",
@@ -556,7 +556,7 @@
 "tags": []
 },
 "source": [
-"# Set up hosting for the model\n",
+"## Set up hosting for the model\n",
 "Now that we've found our best performing model (in this example the one with mini_batch_size=100), we can deploy it behind an Amazon SageMaker real-time hosted endpoint. This will allow out to make predictions (or inference) from the model dyanamically."
 ]
 },
@@ -601,7 +601,7 @@
 "tags": []
 },
 "source": [
-"# Tracing the lineage of a model starting from an endpoint\n",
+"## Tracing the lineage of a model starting from an endpoint\n",
 "Now we will present an example of how you can use the Amazon SageMaker Search to trace the antecedents of a model deployed at an endpoint i.e. unique combination of algorithms, datasets, and parameters that brewed the model in first place."
 ]
 },
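The grouping-and-evaluation flow these cells describe reduces to a single call to the SageMaker Search API. Below is a minimal boto3 sketch; the tag key, tag value, and metric name are illustrative placeholders, not values taken from this notebook:

import boto3

sm = boto3.client("sagemaker")

# Group: find every training job carrying our tracking tag.
response = sm.search(
    Resource="TrainingJob",
    SearchExpression={
        "Filters": [
            {
                "Name": "Tags.experiment-group",   # assumed tag key
                "Operator": "Equals",
                "Value": "linear-learner-trials",  # assumed tag value
            }
        ]
    },
    # Evaluate: rank the runs on a metric of our choice.
    SortBy="Metrics.train:objective_loss",         # assumed metric name
    SortOrder="Ascending",
)

for result in response["Results"]:
    job = result["TrainingJob"]
    print(job["TrainingJobName"], job.get("FinalMetricDataList"))

The lineage-tracing section runs the same API in the other direction: starting from the model artifact location behind the endpoint, a `search` call can filter training jobs on that artifact to recover the job, dataset, and hyperparameters that produced it.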

advanced_functionality/tensorflow_bring_your_own/tensorflow_bring_your_own.ipynb

Lines changed: 3 additions & 3 deletions
@@ -72,7 +72,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
+"## Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
 "\n",
 "### An overview of Docker\n",
 "\n",
@@ -449,7 +449,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 2: Training and Hosting your Algorithm in Amazon SageMaker\n",
+"## Part 2: Training and Hosting your Algorithm in Amazon SageMaker\n",
 "Once you have your container packaged, you can use it to train and serve models. Let's do that with the algorithm we made above.\n",
 "\n",
 "## Set up the environment\n",
@@ -608,7 +608,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Reference\n",
+"## Reference\n",
 "- [How Amazon SageMaker interacts with your Docker container for training](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo.html)\n",
 "- [How Amazon SageMaker interacts with your Docker container for inference](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-inference-code.html)\n",
 "- [CIFAR-10 Dataset](https://www.cs.toronto.edu/~kriz/cifar.html)\n",

autopilot/custom-feature-selection/Feature_selection_autopilot.ipynb

Lines changed: 5 additions & 5 deletions
@@ -10,7 +10,7 @@
 "In some cases, customer wants to have the flexibility to bring custom data processing code to SageMaker Autopilot. For example, customer might have datasets with large number of independent variables. Customer would like to have a custom feature selection step to remove irrelevant variables first. The resulted smaller dataset is then used to launch SageMaker Autopilot job. Customer would also like to include both the custom processing code and models from SageMaker Autopilot for easy deployment—either on a real-time endpoint or for batch processing. We will demonstrate how to achieve this in this notebook. \n",
 "\n",
 "\n",
-"### Table of contents\n",
+"## Table of contents\n",
 "* [Setup](#setup)\n",
 " * [Generate dataset](#data_gene)\n",
 " * [Upload data to S3](#upload)\n",
@@ -29,7 +29,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Setup <a class=\"anchor\" id=\"setup\"></a>"
+"## Setup <a class=\"anchor\" id=\"setup\"></a>"
 ]
 },
 {
@@ -139,7 +139,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Feature Selection <a class=\"anchor\" id=\"feature_selection\"></a>\n",
+"## Feature Selection <a class=\"anchor\" id=\"feature_selection\"></a>\n",
 "\n",
 "We use Scikit-learn on Sagemaker `SKLearn` Estimator with a feature selection script as an entry point. The script is very similar to a training script you might run outside of SageMaker, but you can access useful properties about the training environment through various environment variables, such as:\n",
 "\n",
@@ -439,7 +439,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Autopilot <a class=\"anchor\" id=\"autopilot\"></a>"
+"## Autopilot <a class=\"anchor\" id=\"autopilot\"></a>"
 ]
 },
 {
@@ -652,7 +652,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Serial Inference Pipeline that combines feature selection and autopilot <a class=\"anchor\" id=\"serial_inference\"></a>\n"
+"## Serial Inference Pipeline that combines feature selection and autopilot <a class=\"anchor\" id=\"serial_inference\"></a>\n"
 ]
 },
 {
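For orientation, the serial inference pipeline named in the last hunk corresponds to a `PipelineModel` in the SageMaker Python SDK: the feature-selection container runs first, and its output feeds the Autopilot model behind a single endpoint. A sketch, in which every image URI, S3 path, and name is a placeholder:

import sagemaker
from sagemaker.model import Model
from sagemaker.pipeline import PipelineModel

session = sagemaker.Session()
role = sagemaker.get_execution_role()

# Stage 1: the custom feature-selection model (placeholder image/artifact).
feature_selection_model = Model(
    image_uri="<sklearn-inference-image>",
    model_data="s3://<bucket>/feature-selection/model.tar.gz",
    role=role,
)

# Stage 2: the model produced by the Autopilot job (placeholder image/artifact).
autopilot_model = Model(
    image_uri="<autopilot-inference-image>",
    model_data="s3://<bucket>/autopilot/model.tar.gz",
    role=role,
)

# Requests pass through the containers in order; the caller receives stage 2's output.
pipeline_model = PipelineModel(
    name="feature-selection-plus-autopilot",
    role=role,
    models=[feature_selection_model, autopilot_model],
)
pipeline_model.deploy(initial_instance_count=1, instance_type="ml.m5.large")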

aws_marketplace/creating_marketplace_products/algorithms/Bring_Your_Own-Creating_Algorithm_and_Model_Package.ipynb

Lines changed: 23 additions & 23 deletions
@@ -77,7 +77,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
+"## Part 1: Packaging and Uploading your Algorithm for use with Amazon SageMaker\n",
 "\n",
 "### An overview of Docker\n",
 "\n",
@@ -286,7 +286,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Testing your algorithm on your local machine or on an Amazon SageMaker notebook instance\n",
+"### Testing your algorithm on your local machine or on an Amazon SageMaker notebook instance\n",
 "\n",
 "While you're first packaging an algorithm use with Amazon SageMaker, you probably want to test it yourself to make sure it's working right. In the directory `container/local_test`, there is a framework for doing this. It includes three shell scripts for running and using the container and a directory structure that mimics the one outlined above.\n",
 "\n",
@@ -303,11 +303,11 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 2: Training, Batch Inference and Hosting your Algorithm in Amazon SageMaker\n",
+"## Part 2: Training, Batch Inference and Hosting your Algorithm in Amazon SageMaker\n",
 "\n",
 "Once you have your container packaged, you can use it to train and serve models. Let's do that with the algorithm we made above.\n",
 "\n",
-"## Set up the environment\n",
+"### Set up the environment\n",
 "\n",
 "Here we specify a bucket to use and the role that will be used for working with Amazon SageMaker."
 ]
@@ -333,7 +333,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Create the session\n",
+"### Create the session\n",
 "\n",
 "The session remembers our connection parameters to Amazon SageMaker. We'll use it to perform all of our SageMaker operations."
 ]
@@ -353,7 +353,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Upload the data for training\n",
+"### Upload the data for training\n",
 "\n",
 "When training large models with huge amounts of data, you'll typically use big data tools, like Amazon Athena, AWS Glue, or Amazon EMR, to create your data in S3. For the purposes of this example, we're using some the classic [Iris dataset](https://en.wikipedia.org/wiki/Iris_flower_data_set), which we have included. \n",
 "\n",
@@ -376,7 +376,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Create an estimator and fit the model\n",
+"### Create an estimator and fit the model\n",
 "\n",
 "In order to do use Amazon SageMaker to fit our algorithm, we'll create an `Estimator` that defines how to use the container to train. This includes the configuration we need to invoke SageMaker training:\n",
 "\n",
@@ -422,7 +422,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Batch Transform Job\n",
+"### Batch Transform Job\n",
 "\n",
 "Now let's use the model built to run a batch inference job and verify it works.\n"
 ]
@@ -431,7 +431,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Batch Transform Input Preparation\n",
+"#### Batch Transform Input Preparation\n",
 "\n",
 "The snippet below is removing the \"label\" column (column indexed at 0) and retaining the rest to be batch transform's input. \n",
 "\n",
@@ -463,7 +463,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Run Batch Transform\n",
+"#### Run Batch Transform\n",
 "\n",
 "Now that our batch transform input is setup, we run the transformation job next"
 ]
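A sketch of the transform-job launch this cell performs, assuming `estimator` is the fitted estimator from the training step and a placeholder S3 input path:

# Create a transformer from the fitted estimator and run batch inference.
transformer = estimator.transformer(
    instance_count=1,
    instance_type="ml.m5.large",
)
transformer.transform(
    data="s3://<bucket>/<prefix>/batch_input.csv",  # placeholder input location
    content_type="text/csv",
    split_type="Line",
)
transformer.wait()
print("Output written to:", transformer.output_path)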
@@ -485,7 +485,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"#### Inspect the Batch Transform Output in S3"
+"##### Inspect the Batch Transform Output in S3"
 ]
 },
 {
@@ -511,7 +511,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Deploy the model\n",
+"### Deploy the model\n",
 "\n",
 "Deploying the model to Amazon SageMaker hosting just requires a `deploy` call on the fitted model. This call takes an instance count, instance type, and optionally serializer and deserializer functions. These are used when the resulting predictor is created on the endpoint."
 ]
@@ -532,7 +532,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Choose some data and use it for a prediction\n",
+"#### Choose some data and use it for a prediction\n",
 "\n",
 "In order to do some predictions, we'll extract some of the data we used for training and do predictions against it. This is, of course, bad statistical practice, but a good way to see how the mechanism works."
 ]
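Taken together, the deploy, predict, and cleanup cells reduce to a few SDK calls. A sketch, where the serializer choice and the sample row are illustrative:

from sagemaker.serializers import CSVSerializer

# Deploy the fitted model behind a real-time endpoint.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
    serializer=CSVSerializer(),
)

# Score one Iris-like feature row against the endpoint.
print(predictor.predict("5.1,3.5,1.4,0.2"))

# Cleanup: delete the endpoint when you are done with it.
predictor.delete_endpoint()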
@@ -576,7 +576,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Cleanup Endpoint\n",
+"#### Cleanup Endpoint\n",
 "\n",
 "When you're done with the endpoint, you'll want to clean it up."
 ]
@@ -594,7 +594,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 3 - Package your resources as an Amazon SageMaker Algorithm\n",
+"## Part 3 - Package your resources as an Amazon SageMaker Algorithm\n",
 "(If you looking to sell a pretrained model (ModelPackage), please skip to Part 4.)\n",
 "\n",
 "Now that you have verified that the algorithm code works for training, live inference and batch inference in the above sections, you can start packaging it up as an Amazon SageMaker Algorithm."
@@ -615,7 +615,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Algorithm Definition\n",
+"### Algorithm Definition\n",
 "\n",
 "SageMaker Algorithm is comprised of 2 parts:\n",
 "\n",
@@ -741,7 +741,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Putting it all together\n",
+"### Putting it all together\n",
 "\n",
 "Now we put all the pieces together in the next cell and create an Amazon SageMaker Algorithm"
 ]
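The cell referenced here issues a CreateAlgorithm request. The sketch below is heavily abridged: every name, image URI, and channel setting is a placeholder, and the real notebook also supplies the full inference and validation specifications that an AWS Marketplace listing requires.

import boto3

sm = boto3.client("sagemaker")
sm.create_algorithm(
    AlgorithmName="my-marketplace-algorithm",  # placeholder
    AlgorithmDescription="Sample decision-tree algorithm",
    TrainingSpecification={
        "TrainingImage": "<account>.dkr.ecr.<region>.amazonaws.com/<algo>:latest",
        "SupportedTrainingInstanceTypes": ["ml.m5.large"],
        "SupportsDistributedTraining": False,
        "TrainingChannels": [
            {
                "Name": "training",
                "SupportedContentTypes": ["text/csv"],
                "SupportedInputModes": ["File"],
            }
        ],
    },
    # A marketplace listing additionally needs InferenceSpecification,
    # ValidationSpecification, and CertifyForMarketplace=True (omitted here).
)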
@@ -777,7 +777,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Describe the algorithm\n",
+"#### Describe the algorithm\n",
 "\n",
 "The next cell describes the Algorithm and waits until it reaches a terminal state (Completed or Failed)"
 ]
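And a sketch of the wait loop that cell implements, reusing the boto3 client and the algorithm name assumed in the previous sketch:

import time

# Poll until the algorithm reaches a terminal state (Completed or Failed).
while True:
    status = sm.describe_algorithm(
        AlgorithmName="my-marketplace-algorithm"  # placeholder from above
    )["AlgorithmStatus"]
    print("AlgorithmStatus:", status)
    if status in ("Completed", "Failed"):
        break
    time.sleep(30)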
@@ -805,11 +805,11 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Part 4 - Package your resources as an Amazon SageMaker ModelPackage\n",
+"## Part 4 - Package your resources as an Amazon SageMaker ModelPackage\n",
 "\n",
 "In this section, we will see how you can package your artifacts (ECR image and the trained artifact from your previous training job) into a ModelPackage. Once you complete this, you can list your product as a pretrained model in the AWS Marketplace.\n",
 "\n",
-"## Model Package Definition\n",
+"### Model Package Definition\n",
 "A Model Package is a reusable model artifacts abstraction that packages all ingredients necessary for inference. It consists of an inference specification that defines the inference image to use along with an optional model weights location.\n"
 ]
 },
@@ -892,7 +892,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Putting it all together\n",
+"### Putting it all together\n",
 "\n",
 "Now we put all the pieces together in the next cell and create an Amazon SageMaker Model Package."
 ]
@@ -951,7 +951,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Debugging Creation Issues\n",
+"### Debugging Creation Issues\n",
 "\n",
 "Entity creation typically never fails in the synchronous path. However, the validation process can fail for many reasons. If the above Algorithm creation fails, you can investigate the cause for the failure by looking at the \"AlgorithmStatusDetails\" field in the Algorithm object or \"ModelPackageStatusDetails\" field in the ModelPackage object. You can also look for the Training Jobs / Transform Jobs created in your account as part of our validation and inspect their logs for more hints on what went wrong. \n",
 "\n",
@@ -963,7 +963,7 @@
 "metadata": {},
 "source": [
 "\n",
-"## List on AWS Marketplace\n",
+"### List on AWS Marketplace\n",
 "\n",
 "Next, please go back to the Amazon SageMaker console, click on \"Algorithms\" (or \"Model Packages\") and you'll find the entity you created above. If it was successfully created and validated, you should be able to select the entity and \"Publish new ML Marketplace listing\" from SageMaker console.\n",
 "<img src=\"images/publish-to-marketplace-action.png\"/>"
