Commit 0e62cd5

Updated input parameters
1 parent 5542dd3 commit 0e62cd5

File tree

1 file changed: +3 -43 lines changed

doc/workflows/kubernetes/using_amazon_sagemaker_components.rst

Lines changed: 3 additions & 43 deletions
@@ -499,54 +499,14 @@ parameters for each component of your pipeline. These parameters can
 also be updated when using other pipelines. We have provided default
 values for all parameters in the sample classification pipeline file.
 
-The following are the only parameters you may need to modify to run the
-sample pipelines. To modify these parameters, update their entries in
-the sample classification pipeline file.
+The following are the only parameters you need to pass to run the
+sample pipelines. To pass these parameters, update their entries when creating a new run.
 
 - **Role-ARN:** This must be the ARN of an IAM role that has full
   Amazon SageMaker access in your AWS account. Use the ARN
   of ``kfp-example-pod-role``.
 
-- **The Dataset Buckets**: You must change the S3 bucket with the input
-  data for each of the components. Replace the following with the link
-  to your S3 bucket:
-
-  - **Train channel:** ``"S3Uri": "s3://<your-s3-bucket-name>/data"``
-
-  - **HPO channels for test/HPO channel for
-    train:** ``"S3Uri": "s3://<your-s3-bucket-name>/data"``
-
-  - **Batch
-    transform:** ``"batch-input": "s3://<your-s3-bucket-name>/data"``
-
-- **Output buckets:** Replace the output buckets with S3 buckets you
-  have write permission to. Replace the following with the link to your
-  S3 bucket:
-
-  - **Training/HPO**:
-    ``output_location='s3://<your-s3-bucket-name>/output'``
-
-  - **Batch Transform**:
-    ``batch_transform_ouput='s3://<your-s3-bucket-name>/output'``
-
-- **Region:**\ The default pipelines work in us-east-1. If your
-  cluster is in a different region, update the following:
-
-  - The ``region='us-east-1'`` Parameter in the input list.
-
-  - The algorithm images for Amazon SageMaker. If you use one of
-    the Amazon SageMaker built-in algorithm images, select the image
-    for your region. Construct the image name using the information
-    in `Common parameters for built-in
-    algorithms <https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-algo-docker-registry-paths.html>`__.
-    For example:
-
-    ::
-
-      382416733822.dkr.ecr.us-east-1.amazonaws.com/kmeans:1
-
-  - The S3 buckets with the dataset. Use the steps in Prepare datasets
-    to copy the data to a bucket in the same region as the cluster.
+- **The Dataset Bucket**: This is the name of the S3 bucket that you uploaded the ``kmeans_preprocessing.py`` file to.
 
 You can adjust any of the input parameters using the KFP UI and trigger
 your run again.
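The updated text tells readers to supply these values when creating a new run rather than by editing the pipeline file. As a rough illustration only, the sketch below shows how such run parameters could be passed through the Kubeflow Pipelines SDK instead of the KFP UI. The pipeline package file name and the parameter keys (``role_arn``, ``bucket_name``) are assumptions for illustration and are not taken from this commit; they must match the parameter names defined in your compiled sample pipeline.

.. code:: python

    # Hypothetical sketch: submitting the sample classification pipeline with the
    # Kubeflow Pipelines SDK instead of the KFP UI. The package file name and the
    # parameter keys below are assumptions; use the names from your own pipeline.
    import kfp

    # Assumes the KFP endpoint is reachable, e.g. via kubectl port-forward or
    # from inside the cluster.
    client = kfp.Client()

    run = client.create_run_from_pipeline_package(
        pipeline_file="mnist-classification-pipeline.yaml",  # assumed package name
        arguments={
            "role_arn": "arn:aws:iam::<account-id>:role/kfp-example-pod-role",
            "bucket_name": "<your-s3-bucket-name>",  # bucket holding kmeans_preprocessing.py
        },
        run_name="sagemaker-components-sample-run",
    )
    print(f"Started run: {run.run_id}")

The same parameters can of course be entered in the KFP UI when triggering a run, as the documentation above describes.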
