Introduce extension/llm/export_llm #11746
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/11746
Note: Links to docs will display an error until the docs builds have been completed.
❌ 6 new failures as of commit 38e1edc with merge base 7bd15b9.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D76781745
Summary: Pull Request resolved: pytorch#11746. Introduces the frontend of export_llm in extension/llm, while keeping most of the code in examples/models/llama as a first step. Differential Revision: D76781745
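To make the "thin frontend" idea concrete, here is a minimal sketch of what such an entry point might look like. The argument names and dispatch are illustrative assumptions, not the actual export_llm API introduced by this PR:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # Hypothetical frontend: collects export options at the extension/llm
    # level and would delegate to the existing implementation that still
    # lives under examples/models/llama.
    parser = argparse.ArgumentParser(
        prog="export_llm",
        description="Export an LLM with ExecuTorch.",
    )
    parser.add_argument("--model", required=True, help="Model identifier, e.g. llama3")
    parser.add_argument("--output", default="model.pte", help="Output .pte path")
    return parser


def main(argv=None) -> dict:
    args = build_parser().parse_args(argv)
    # The real frontend would invoke the export pipeline here; this sketch
    # just returns the parsed config to illustrate the separation of
    # frontend (argument handling) from backend (export logic).
    return vars(args)
```

The point of the sketch is the layering: the frontend owns only the user-facing interface, so the export implementation can later migrate out of examples/models/llama without changing how users invoke it.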
Reviewed By: larryliu0820
Please see inline.
- Attach the PRs to issue #11527
- Please add me as a reviewer for future extensions/llm directory changes (https://github.com/pytorch/executorch/pull/11785/files)
""" | ||
Export an LLM with ExecuTorch. Currently follows the following steps: | ||
1. Instantiate our custom PyTorch transformer definition from examples/llama/models/llama_transformer.py. |
Don't mention implementation details in the docblock. If it's a public API, the docblock should contain a description of the contract.
I will move this information to a README.
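To illustrate the reviewer's suggestion, a contract-oriented docblock might look like the sketch below. The function signature and wording are hypothetical, not the docstring that actually landed; the point is that it states what callers can rely on, while implementation details (such as which transformer definition is instantiated) move to a README:

```python
def export_llm(model_name: str, output_path: str) -> None:
    """Export an LLM to an ExecuTorch program.

    Contract:
        model_name: identifier of a supported model to export.
        output_path: destination file for the serialized program.

    Implementation details are documented in the README, not here.
    """
    # Hypothetical stub for illustration only; the real function performs
    # the export pipeline.
    raise NotImplementedError
```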