Reland cadence quantized_linear_per_tensor_out cpu 1eb924f^..fd33294 #7204


Merged: 1 commit merged into main on Dec 5, 2024

Conversation

@zonglinpeng (Contributor) commented on Dec 5, 2024

Reverts #7203

Test plan
python3 -m examples.cadence.models.babyllama
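For context on what the relanded operator computes, below is a minimal NumPy sketch of per-tensor quantized linear math (one scale and zero point per tensor, rather than per channel). The function name, signature, and int8 ranges here are illustrative assumptions, not the actual ExecuTorch/Cadence kernel interface.

```python
import numpy as np

def quantized_linear_per_tensor(x_q, w_q, bias_q,
                                x_scale, x_zero_point,
                                w_scale, w_zero_point,
                                out_scale, out_zero_point):
    # Remove zero points and accumulate in int32: acc = (x_q - x_zp) @ (w_q - w_zp)^T
    acc = (x_q.astype(np.int32) - x_zero_point) @ (w_q.astype(np.int32) - w_zero_point).T
    if bias_q is not None:
        # Bias assumed pre-quantized to int32 at scale x_scale * w_scale.
        acc += bias_q
    # Requantize the accumulator into the output's quantized domain.
    out = acc * (x_scale * w_scale / out_scale) + out_zero_point
    return np.clip(np.round(out), -128, 127).astype(np.int8)

# Example: a 2x4 int8 activation times a 3x4 int8 weight (hypothetical values).
x_q = np.random.randint(-128, 128, (2, 4), dtype=np.int8)
w_q = np.random.randint(-128, 128, (3, 4), dtype=np.int8)
y_q = quantized_linear_per_tensor(x_q, w_q, None, 0.05, 0, 0.02, 0, 0.1, 0)
```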


pytorch-bot (bot) commented on Dec 5, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/7204

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit d13ce8d with merge base a9565aa:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Dec 5, 2024
@kirklandsign changed the title from Revert "Revert cadence quantized_linear_per_tensor_out cpu1eb924f^..fd33294" to Reland cadence quantized_linear_per_tensor_out cpu 1eb924f^..fd33294 on Dec 5, 2024
@pytorch-bot added the ci-no-td label on Dec 5, 2024
@zonglinpeng merged commit 78b60df into main on Dec 5, 2024
41 of 44 checks passed
@zonglinpeng deleted the revert-7203-revert-cadence branch on December 5, 2024 at 20:50
Labels
ci-no-td, CLA Signed, topic: not user facing
3 participants