llama export with input vocab pruning #6421


Merged
merged 1 commit into from
Oct 22, 2024

Conversation

@navsud navsud commented Oct 21, 2024

Summary:
D62143905 added llama model export with output vocab pruning. Along similar lines, this diff applies the same approach to input vocabulary pruning.

The assumption here is that the model was trained with the full vocabulary, and the input vocabulary is pruned out after training, at export time.

Reviewed By: iseeyuan

Differential Revision: D64723663
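The diff itself is not shown on this page. As a rough illustration of the idea described in the summary, the following is a hypothetical sketch (not the actual ExecuTorch export code): given a model trained with a full vocabulary, build a smaller input embedding at export time containing only the kept token rows, plus an old-ID-to-new-ID map for remapping tokenizer output at inference. The function name `prune_input_vocab` and its signature are assumptions for illustration only.

```python
import torch
import torch.nn as nn


def prune_input_vocab(
    embedding: nn.Embedding, token_ids_to_keep: list[int]
) -> tuple[nn.Embedding, dict[int, int]]:
    """Return a smaller embedding with only the kept rows, plus an
    old-id -> new-id map for remapping token IDs at inference time.

    Hypothetical sketch of export-time input vocab pruning; not the
    actual implementation from this PR.
    """
    keep = torch.tensor(sorted(set(token_ids_to_keep)), dtype=torch.long)
    pruned = nn.Embedding(keep.numel(), embedding.embedding_dim)
    with torch.no_grad():
        # Copy only the rows for the tokens we keep.
        pruned.weight.copy_(embedding.weight[keep])
    # Tokenizer output (old IDs) must be remapped to the new, dense IDs.
    id_map = {int(old): new for new, old in enumerate(keep)}
    return pruned, id_map
```

For example, pruning a 100-token embedding down to tokens `{3, 7, 42}` yields a 3-row embedding, and token 42 maps to new ID 2. Because the kept IDs are sorted, the relative order of token IDs is preserved in the remapped space.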

pytorch-bot bot commented Oct 21, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6421

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 30ee66a with merge base ca47839:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Oct 21, 2024
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D64723663


@iseeyuan iseeyuan left a comment


LGTM.

navsud added a commit to navsud/executorch that referenced this pull request Oct 21, 2024
navsud added a commit to navsud/executorch that referenced this pull request Oct 22, 2024
navsud added a commit to navsud/executorch that referenced this pull request Oct 22, 2024
navsud added a commit to navsud/executorch that referenced this pull request Oct 22, 2024
navsud added a commit to navsud/executorch that referenced this pull request Oct 22, 2024
@facebook-github-bot facebook-github-bot merged commit 89ba47a into pytorch:main Oct 22, 2024
44 of 46 checks passed
Labels
CLA Signed, fb-exported

3 participants