Lower phi3 mini with LoRA to edge for training #4722
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/4722
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 2876c00 with merge base c6347f3.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@dvorjackz has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Summary: Exports phi3-mini with LoRA for training. AOT Autograd traces a backward graph, which is combined with the forward graph into a joint graph, and the joint graph is then lowered to ExecuTorch. Differential Revision: D61309917. Pulled By: dvorjackz
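To make the summary concrete, here is a minimal, dependency-free sketch of the LoRA update that such a training export would optimize. The function and variable names are illustrative, not the actual phi3-mini module names: LoRA keeps the base weight W frozen and trains only a low-rank pair (A, B), so the effective weight is W + (alpha / r) * B @ A.

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def madd(X, Y, scale=1.0):
    """Elementwise X + scale * Y."""
    return [[a + scale * b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def lora_forward(x, W, A, B, alpha, r):
    """LoRA-adapted linear layer: y = x @ (W + (alpha / r) * B @ A).

    W (d x k) is frozen; only A (r x k) and B (d x r) are trainable.
    """
    delta = matmul(B, A)               # low-rank update, shape d x k
    W_eff = madd(W, delta, alpha / r)  # effective weight seen by the forward
    return matmul(x, W_eff)
```

With B initialized to zeros (the standard LoRA init), the adapted layer reproduces the frozen base layer exactly, which is why LoRA can be bolted onto a pretrained model without changing its initial behavior.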
Force-pushed 01974e0 to 4bab1f2
This pull request was exported from Phabricator. Differential Revision: D61309917
Force-pushed 4bab1f2 to 94ed14a
Force-pushed 94ed14a to 9914eef
Summary: Exports phi3-mini with LoRA for training. AOT Autograd traces a backward graph, which is combined with the forward graph into a joint graph, and the joint graph is then lowered to ExecuTorch. Reviewed By: helunwencser. Differential Revision: D61309917. Pulled By: dvorjackz
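The "joint graph" idea in the summary — forward and backward captured as one program so on-device training needs no tracing at runtime — can be sketched with a hand-written toy example. This is not the AOT Autograd machinery itself, just an illustration of the shape of the result: a scalar LoRA-style model y = x * (w + b * a) with squared-error loss, where a single function computes the loss together with the gradients of the trainable parameters a and b (w stays frozen).

```python
def joint_step(x, w, a, b, target):
    """One 'joint graph' step for the toy model y = x * (w + b * a).

    Returns (loss, dL/da, dL/db) in a single call, mirroring how a joint
    forward+backward graph bundles both passes into one exported program.
    Gradients are derived by hand: L = (y - t)^2, so
        dL/da = 2 * (y - t) * x * b
        dL/db = 2 * (y - t) * x * a
    """
    y = x * (w + b * a)        # forward pass
    diff = y - target
    loss = diff * diff
    grad_a = 2.0 * diff * x * b  # backward pass, fused into the same function
    grad_b = 2.0 * diff * x * a
    return loss, grad_a, grad_b
```

A runtime can then run gradient-descent updates on a and b by calling this one function in a loop, which is the property the PR's lowering pipeline aims to preserve for the real model.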
Force-pushed 9914eef to be32a62
Force-pushed be32a62 to 2876c00