Applied quantization for linear with bias=True in pre_quantization #9472
Conversation
Summary: Applied quantization for linear layers with bias=True in pre_quantization and in checkpoint conversion. Verified with a small checkpoint of the speech encoder eager model. Differential Revision: D71573144
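The PR's subject is quantizing `nn.Linear` layers that carry a bias term. As an illustration only (the actual pre_quantization pass is not shown in this thread, and the function names below are hypothetical), a common scheme is symmetric per-output-channel int8 quantization of the weight while the bias stays in float:

```python
import numpy as np

def quantize_per_channel(weight, n_bits=8):
    # Symmetric per-output-channel quantization (illustrative sketch,
    # not the executorch implementation): one scale per output row.
    qmax = 2 ** (n_bits - 1) - 1
    scales = np.abs(weight).max(axis=1, keepdims=True) / qmax
    q = np.clip(np.round(weight / scales), -qmax - 1, qmax).astype(np.int8)
    return q, scales

def quantized_linear(x, q_weight, scales, bias):
    # Dequantize the weight and apply the bias in float -- this is the
    # bias=True case the PR extends quantization to cover.
    w = q_weight.astype(np.float32) * scales
    return x @ w.T + bias

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8)).astype(np.float32)
b = rng.normal(size=4).astype(np.float32)
x = rng.normal(size=(2, 8)).astype(np.float32)

qw, s = quantize_per_channel(w)
y_ref = x @ w.T + b                       # float reference
y_q = quantized_linear(x, qw, s, b)       # quantized path
print("max abs error:", np.abs(y_ref - y_q).max())
```

The key point is that the bias does not need its own integer encoding here; it is added after the weight is dequantized, so supporting bias=True mainly means threading the bias tensor through the quantization pass and checkpoint conversion unchanged.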
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/9472
Note: Links to docs will display an error until the docs builds have been completed.
❌ 2 New Failures
As of commit 981d066 with merge base 1a9a59b, the following jobs have failed:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D71573144
@pytorchbot label "topic: not user facing"
@YIWENX14 looks like this PR breaks OSS CI, can you take a look?
@Gasoonjia which one?
@larryliu0820 this one
@Gasoonjia I don't think this PR is causing this failure
@larryliu0820 oh great, thanks for confirming that. From the CI board it just looked like this PR introduced the issue.
Differential Revision: D71573144 Pull Request resolved: pytorch#9472