Commit 37e7d92

yushangdi authored and facebook-github-bot committed
Back out "Revert export_for_training migration in llm/export/builder.py" (#6180)
Summary:
Pull Request resolved: #6180

Revert the back-out, since the underlying issue has been fixed in pytorch/pytorch#137540 with D64080561.

Reviewed By: dvorjackz

Differential Revision: D64260221

fbshipit-source-id: 500af6faf02f160b67c4c010962a9b3a5312f3a7

1 parent: e95aa9d

File tree

1 file changed: +3 −3 lines

extension/llm/export/builder.py

Lines changed: 3 additions & 3 deletions
@@ -29,10 +29,10 @@
 from executorch.extension.export_util.utils import export_to_edge, save_pte_program
 from executorch.extension.llm.tokenizer.utils import get_tokenizer
-from torch._export import capture_pre_autograd_graph
 from torch.ao.quantization.quantize_pt2e import convert_pt2e, prepare_pt2e
 from torch.ao.quantization.quantizer import Quantizer
 from torch.ao.quantization.quantizer.composable_quantizer import ComposableQuantizer
+from torch.export import export_for_training
 from torch.nn.attention import SDPBackend

 FORMAT = "[%(levelname)s %(asctime)s %(filename)s:%(lineno)s] %(message)s"
@@ -193,12 +193,12 @@ def capture_pre_autograd_graph(self) -> "LLMEdgeManager":
                 strict=True,
             ).module()
         else:
-            self.pre_autograd_graph_module = capture_pre_autograd_graph(
+            self.pre_autograd_graph_module = export_for_training(
                 self.model,
                 self.example_inputs,
                 kwargs=self.example_kwarg_inputs,
                 dynamic_shapes=dynamic_shape,
-            )
+            ).module()

         return self