[ExecuTorch] Support BFloat16 in CPUBlas gemm #5122

Merged
25 commits merged into main from gh/swolchok/31/head on Sep 9, 2024

Conversation

bfloat16.h was a stub. I've filled it out by porting the c10 implementation, added the new type to the ET_SWITCH and ET_FORALL macros, and hooked it up to promoteTypes. I extended the half_to_float argument to promoteTypes to also coerce bfloat16 to float, because I figured anybody who wants to ignore half probably also wants to ignore bf16.

Differential Revision: [D61981361](https://our.internmc.facebook.com/intern/diff/D61981361/)

[ghstack-poisoned]
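For readers unfamiliar with the format: a bfloat16 value is just the upper 16 bits of an IEEE-754 float32, so the ported implementation is mostly bit manipulation. The following is a minimal sketch of the two conversions, for illustration only; it is not the actual c10 or ExecuTorch code, and it omits the NaN special-casing the real implementation has.

```cpp
#include <cstdint>
#include <cstring>

// Illustrative stand-in for the real type: the upper 16 bits of a float32.
struct BF16 {
  uint16_t bits;
};

// float32 -> bfloat16 with round-to-nearest-even (the real code also
// special-cases NaN, which this sketch skips).
inline BF16 float_to_bf16(float f) {
  uint32_t u;
  std::memcpy(&u, &f, sizeof(u));
  // Bias the value so that truncating the low 16 bits rounds to nearest,
  // ties to even.
  const uint32_t rounding_bias = 0x7FFF + ((u >> 16) & 1);
  return BF16{static_cast<uint16_t>((u + rounding_bias) >> 16)};
}

// bfloat16 -> float32: place the stored bits in the high half, zero the rest.
inline float bf16_to_float(BF16 b) {
  const uint32_t u = static_cast<uint32_t>(b.bits) << 16;
  float f;
  std::memcpy(&f, &u, sizeof(f));
  return f;
}
```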
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981356](https://our.internmc.facebook.com/intern/diff/D61981356/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981355](https://our.internmc.facebook.com/intern/diff/D61981355/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981353](https://our.internmc.facebook.com/intern/diff/D61981353/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981357](https://our.internmc.facebook.com/intern/diff/D61981357/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981364](https://our.internmc.facebook.com/intern/diff/D61981364/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981360](https://our.internmc.facebook.com/intern/diff/D61981360/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981359](https://our.internmc.facebook.com/intern/diff/D61981359/)

[ghstack-poisoned]
Adding bfloat16 support to important ops for LLMs to start.

Differential Revision: [D61981362](https://our.internmc.facebook.com/intern/diff/D61981362/)

[ghstack-poisoned]
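The op work above is what feeds the CPUBlas gemm named in the PR title. As a rough illustration of the usual pattern for a bf16 gemm on CPU (not ExecuTorch's actual kernel; BF16, bf16_to_float, and float_to_bf16 are the hypothetical helpers from the sketch earlier): read bfloat16 operands, accumulate in float32, and convert back on the way out.

```cpp
// Naive row-major gemm sketch: C = A * B with bf16 storage and fp32 accumulation.
// Illustration only; a real kernel would block, vectorize, and support
// transposes, alpha/beta, strides, etc.
void gemm_bf16_naive(
    int m, int n, int k,
    const BF16* a,  // m x k
    const BF16* b,  // k x n
    BF16* c) {      // m x n
  for (int i = 0; i < m; ++i) {
    for (int j = 0; j < n; ++j) {
      float acc = 0.0f;  // accumulate in float32 to limit rounding error
      for (int p = 0; p < k; ++p) {
        acc += bf16_to_float(a[i * k + p]) * bf16_to_float(b[p * n + j]);
      }
      c[i * n + j] = float_to_bf16(acc);
    }
  }
}
```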
The LLM runner assumed that the data type could only be float or half. Support bfloat16 and neaten up the code while we're at it.

Differential Revision: [D61981354](https://our.internmc.facebook.com/intern/diff/D61981354/)

**NOTE FOR REVIEWERS**: This PR has internal Meta-specific changes or comments, please review them on [Phabricator](https://our.internmc.facebook.com/intern/diff/D61981354/)!

[ghstack-poisoned]
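A hedged sketch of what removing the float-or-half assumption looks like in a runner: switch on the output dtype when reading logits instead of assuming one of two types. The enum, function name, and helpers below are placeholders, not the actual ExecuTorch runner API; BF16 and bf16_to_float come from the earlier sketch.

```cpp
#include <cassert>
#include <cstddef>

// Placeholder dtype enum for the sketch (ExecuTorch has its own ScalarType).
enum class Dtype { Float, Half, BFloat16 };

// Read one logit, whatever the model's output dtype is.
float read_logit(const void* data, std::size_t idx, Dtype dtype) {
  switch (dtype) {
    case Dtype::Float:
      return static_cast<const float*>(data)[idx];
    case Dtype::BFloat16:
      return bf16_to_float(static_cast<const BF16*>(data)[idx]);
    case Dtype::Half:
      // An fp16 -> float conversion would go here; omitted to keep this short.
      assert(false && "Half not handled in this sketch");
      return 0.0f;
  }
  return 0.0f;  // unreachable for valid dtype values
}
```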
Support creating bf16 PTEs.

Differential Revision: [D61981363](https://our.internmc.facebook.com/intern/diff/D61981363/)

[ghstack-poisoned]
…n export_llama"

Support creating bf16 PTEs.

Differential Revision: [D61981363](https://our.internmc.facebook.com/intern/diff/D61981363/)

[ghstack-poisoned]

pytorch-bot bot commented Sep 6, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/5122

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 9c7e8fb with merge base b69ae0c:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Sep 6, 2024
@facebook-github-bot (Contributor) commented
This pull request was exported from Phabricator. Differential Revision: D62151658

@swolchok swolchok changed the base branch from gh/swolchok/31/base to gh/swolchok/30/head September 6, 2024 00:13
Base automatically changed from gh/swolchok/30/head to main September 6, 2024 23:54
@facebook-github-bot facebook-github-bot merged commit 6b1e328 into main Sep 9, 2024
36 checks passed
@facebook-github-bot facebook-github-bot deleted the gh/swolchok/31/head branch September 9, 2024 19:32
kedarnath03 pushed a commit to kedarnath03/executorch that referenced this pull request Jun 25, 2025
Differential Revision: [D62151658](https://our.internmc.facebook.com/intern/diff/D62151658/)

ghstack-source-id: 241282450
Pull Request resolved: pytorch/executorch#5122