examples/dynamo/mutable_torchtrt_module_example.py
71 additions & 5 deletions
@@ -11,12 +11,13 @@
The Mutable Torch TensorRT Module is designed to address these challenges, making interaction with the Torch-TensorRT module easier than ever.

In this tutorial, we are going to walk through
- 1. Sample workflow of Mutable Torch TensorRT Module with ResNet 18
- 2. Save a Mutable Torch TensorRT Module
- 3. Integration with Huggingface pipeline in LoRA use case
- 4. Usage of dynamic shape with Mutable Torch TensorRT Module
+ 1. Sample workflow of Mutable Torch TensorRT Module with ResNet 18
+ 2. Save a Mutable Torch TensorRT Module
+ 3. Integration with Huggingface pipeline in LoRA use case
+ 4. Usage of dynamic shape with Mutable Torch TensorRT Module
"""

+ # %%
import numpy as np
import torch
import torch_tensorrt as torch_trt
@@ -144,6 +145,12 @@
# %%
# Use Mutable Torch TensorRT module with dynamic shape
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ # When adding a dynamic shape hint to MutableTorchTensorRTModule, the shape hint should EXACTLY follow the semantics of arg_inputs and kwarg_inputs passed to the forward function
+ # and should not omit any entries (except None in the kwarg_inputs). If there is a nested dict/list in the input, the dynamic shape for that entry should also be a nested dict/list.
+ # If the dynamic shape is not required for an input, an empty dictionary should be given as the shape hint for that input.
+ # Note that you should exclude keyword arguments with value None, as those will be filtered out.
+
+
class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
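To make the hint structure described in the comments above concrete, here is a small illustrative mapping that is not part of this diff: the forward signature, the keyword names, and the dimension bounds are hypothetical, and the snippet only restates the rules stated in those comments.

import torch

# Hypothetical signature: forward(self, x, y, mask=None, cfg={"scale": ..., "bias": ...})
# called as model(x, y, mask=None, cfg={"scale": t1, "bias": t2}).
dim = torch.export.Dim("dim", min=1, max=32)

# One hint per positional argument, in order; {} means "this input stays static".
args_dynamic_shapes = ({0: dim}, {})

# Keyword hints mirror the kwarg structure, including nesting.
# "mask" is None at call time, so it is filtered out and must NOT appear here.
kwarg_dynamic_shapes = {
    "cfg": {"scale": {0: dim}, "bias": {}},
}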
@@ -167,7 +174,10 @@ def forward(self, a, b, c={}):
dim_2 = torch.export.Dim("dim2", min=1, max=50)
args_dynamic_shapes = ({1: dim_1}, {0: dim_0})
kwarg_dynamic_shapes = {
-     "c": {"a": {}, "b": {0: dim_2}},
+     "c": {
+         "a": {},
+         "b": {0: dim_2},
+     },  # a's shape does not change so we give it an empty dict
}
# Export the model first with custom dynamic shape constraints
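Since the hunks above only show fragments of the example, a minimal end-to-end sketch of how these hints might be registered follows. It is not part of the diff: the Model body, tensor sizes, and the min_block_size setting are illustrative, and the set_expected_dynamic_shape_range call is assumed from this example file's MutableTorchTensorRTModule API.

import torch
import torch_tensorrt as torch_trt


class Model(torch.nn.Module):
    def forward(self, a, b, c={}):
        # Toy computation over the positional args and the nested kwarg dict
        # (the real example's forward body differs).
        return a.sum() + b.sum() + c["a"].sum() + c["b"].sum()


# Wrap the eager model; the first forward call triggers TensorRT compilation.
model = torch_trt.MutableTorchTensorRTModule(Model().eval().cuda(), min_block_size=1)

dim_0 = torch.export.Dim("dim0", min=1, max=50)
dim_1 = torch.export.Dim("dim1", min=1, max=50)
dim_2 = torch.export.Dim("dim2", min=1, max=50)
args_dynamic_shapes = ({1: dim_1}, {0: dim_0})  # a: dim 1 dynamic, b: dim 0 dynamic
kwarg_dynamic_shapes = {
    "c": {"a": {}, "b": {0: dim_2}},  # c["a"] stays static, so it gets an empty dict
}

# Register the shape hints before the first call, then run with concrete tensors.
model.set_expected_dynamic_shape_range(args_dynamic_shapes, kwarg_dynamic_shapes)
inputs = (torch.rand(2, 3).cuda(), torch.rand(4, 3).cuda())
kwargs = {"c": {"a": torch.rand(2, 3).cuda(), "b": torch.rand(5, 3).cuda()}}
out = model(*inputs, **kwargs)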