
Commit 44afc92

Add LambdaModel and LambdaPredictor documentation
1 parent 9cfc6c2 commit 44afc92

3 files changed (+52 -0 lines changed)

doc/api/inference/model.rst

Lines changed: 5 additions & 0 deletions
@@ -15,3 +15,8 @@ Model
     :members:
     :undoc-members:
     :show-inheritance:
+
+.. autoclass:: sagemaker.serverless.model.LambdaModel
+    :members:
+    :undoc-members:
+    :show-inheritance:

doc/api/inference/predictors.rst

Lines changed: 5 additions & 0 deletions
@@ -7,3 +7,8 @@ Make real-time predictions against SageMaker endpoints with Python objects
     :members:
     :undoc-members:
     :show-inheritance:
+
+.. autoclass:: sagemaker.serverless.predictor.LambdaPredictor
+    :members:
+    :undoc-members:
+    :show-inheritance:

doc/overview.rst

Lines changed: 42 additions & 0 deletions
@@ -1063,6 +1063,48 @@ You can also find these notebooks in the **Advanced Functionality** section of t
 For information about using sample notebooks in a SageMaker notebook instance, see `Use Example Notebooks <https://docs.aws.amazon.com/sagemaker/latest/dg/howitworks-nbexamples.html>`__
 in the AWS documentation.
 
+********************
+Serverless Inference
+********************
+
+You can use the SageMaker Python SDK to perform serverless inference on Lambda.
+
+To deploy models to Lambda, you must complete the following prerequisites:
+1. `Package your model and inference code as a container image. <https://docs.aws.amazon.com/lambda/latest/dg/images-create.html>`_
+2. `Create a role that lists Lambda as a trusted entity. <https://docs.aws.amazon.com/lambda/latest/dg/lambda-intro-execution-role.html#permissions-executionrole-console>`_
+
+After completing the prerequisites, you can deploy your model to Lambda with
+the ``LambdaModel`` class.
+
+.. code:: python
+
+    from sagemaker.serverless import LambdaModel
+
+    image_uri = f"{account}.dkr.ecr.{region}.amazonaws.com/{repository}:latest"
+    role = f"arn:aws:iam::{account}:role/{role}"
+
+    model = LambdaModel(image_uri=image_uri, role=role)
+    predictor = model.deploy("my-lambda-function", timeout=20, memory_size=4092)
+
+The ``LambdaModel.deploy`` method returns a ``LambdaPredictor`` instance. Use
+the ``LambdaPredictor`` instance to perform inference on Lambda.
+
+.. code:: python
+
+    url = "https://c.files.bbci.co.uk/12A9B/production/_111434467_gettyimages-1143489763.jpg"
+    predictor.predict({"url": url})  # {'class': 'tabby'}
+
+Once you are done performing inference on Lambda, delete the ``LambdaModel``
+and ``LambdaPredictor`` instances.
+
+.. code:: python
+
+    model.delete_model()
+    predictor.delete_predictor()
+
+For more details, see the API reference for `LambdaModel <https://sagemaker.readthedocs.io/en/stable/api/inference/model.html#sagemaker.serverless.model.LambdaModel>`_
+and `LambdaPredictor <https://sagemaker.readthedocs.io/en/stable/api/inference/predictors.html#sagemaker.serverless.predictor.LambdaPredictor>`_.
+
 ******************
 SageMaker Workflow
 ******************
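Prerequisite 1 in the new overview section assumes you have already written the inference code that goes into the Lambda container image; that code is not part of this commit. Below is a minimal, hypothetical sketch of such a handler, assuming the payload sent by ``LambdaPredictor.predict`` arrives as the parsed JSON event and using placeholder names (``load_model``, ``classify``) for your own model-loading and inference logic.

.. code:: python

    # Hypothetical handler for the container image in prerequisite 1.
    # Not part of this commit; load_model() and classify() stand in for
    # your own model-loading and inference code.
    import urllib.request


    def load_model():
        """Placeholder: load your serialized model from the image or S3."""
        raise NotImplementedError


    def classify(model, image_bytes):
        """Placeholder: run inference and return a label such as 'tabby'."""
        raise NotImplementedError


    _model = None


    def handler(event, context):
        global _model
        if _model is None:
            _model = load_model()  # load once per warm container
        # Assumed request shape, matching the predict example: {"url": "..."}
        with urllib.request.urlopen(event["url"]) as response:
            image_bytes = response.read()
        # Lambda serializes the returned dict to JSON for the caller.
        return {"class": classify(_model, image_bytes)}

The image itself would be built from an AWS Lambda Python base image with this handler set as the image command, as described in the Lambda container image documentation linked from prerequisite 1.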
