Remove llama-related stuff from bpe_tokenizer #4235


Closed
wants to merge 5 commits

Conversation

@helunwencser (Contributor) commented Jul 12, 2024

Stack from ghstack (oldest at bottom):

We don't need to initialize `vocab_`, `vocab_scores_`, etc.; they are initialized anyway when the tokenizer binary is loaded. Removing these llama-related default values makes `bpe_tokenizer` model-agnostic.

Differential Revision: [D59664556](https://our.internmc.facebook.com/intern/diff/D59664556/)
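
To make the change concrete, here is a minimal C++ sketch of the idea, not the actual diff: the member names follow the description, while the `Error` type and the `load()` signature are assumed placeholders.

```cpp
#include <cstdint>
#include <memory>
#include <string>

// Placeholder for the runtime's real error type.
enum class Error { Ok, ParseFailure };

// Before this PR, the members carried llama-specific defaults, e.g.
//   int32_t vocab_size_ = 32000;  // llama's vocabulary size
// Dropping the in-class initializers leaves the class model-agnostic:
// load() populates everything from the tokenizer binary.
class BPETokenizer {
 public:
  // Reads vocab_size_, vocab_, and vocab_scores_ from the binary file.
  Error load(const std::string& tokenizer_path);

 private:
  int32_t vocab_size_{};
  std::unique_ptr<char*[]> vocab_;
  std::unique_ptr<float[]> vocab_scores_;
};
```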
@pytorch-bot (bot) commented Jul 12, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/4235

Note: Links to docs will display an error until the docs builds have been completed.

❗ 2 Active SEVs

There are 2 currently active SEVs. If your PR is affected, please view them below:

✅ You can merge normally! (1 Unrelated Failure)

As of commit 067c2ed with merge base 4b45264:

BROKEN TRUNK - The following job failed but was already failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures
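
For reference, the rebase can be done locally along these lines (assuming a remote named `upstream` pointing at pytorch/executorch; adjust names to your checkout):

```sh
git fetch upstream viable/strict      # get the known-good branch
git rebase upstream/viable/strict     # replay your commits on top of it
git push --force-with-lease           # update the PR branch safely
```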

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Jul 12, 2024. (This label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed.)
helunwencser added a commit that referenced this pull request on Jul 12, 2024

Pull Request resolved: #4235
Differential Revision: [D59664556](https://our.internmc.facebook.com/intern/diff/D59664556/)
ghstack-source-id: 233477835
@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D59664556

@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D59664556

helunwencser added a commit that referenced this pull request on Jul 12, 2024

Pull Request resolved: #4235
Differential Revision: [D59664556](https://our.internmc.facebook.com/intern/diff/D59664556/)
ghstack-source-id: 233552418
@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D59664556

helunwencser added a commit that referenced this pull request on Jul 12, 2024

Pull Request resolved: #4235
Differential Revision: [D59664556](https://our.internmc.facebook.com/intern/diff/D59664556/)
ghstack-source-id: 233578697
@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D59664556

helunwencser added a commit that referenced this pull request on Jul 12, 2024

Pull Request resolved: #4235
Differential Revision: [D59664556](https://our.internmc.facebook.com/intern/diff/D59664556/)
ghstack-source-id: 233588007
@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D59664556

@facebook-github-bot (Contributor) commented:

This pull request has been merged in 8775280.

kedarnath03 pushed a commit to kedarnath03/executorch that referenced this pull request on Jun 25, 2025

Pull Request resolved: pytorch/executorch#4235
Differential Revision: [D59664556](https://our.internmc.facebook.com/intern/diff/D59664556/)
ghstack-source-id: 233769845
Labels: CLA Signed · fb-exported · Merged
3 participants