Restore constant segment #5141

Merged: 1 commit, Sep 10, 2024

Conversation

@lucylq (Contributor) commented Sep 6, 2024

Summary:
Restore the constant segment in deserialize_pte_binary.

Note that a serialize/deserialize round trip does not produce a byte-identical program: the original size of each constant buffer is not stored, so each restored buffer contains the tensor data plus its alignment padding.

Differential Revision: D62278416
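A minimal sketch of why the restored buffers carry padding. The helper names and the alignment value are illustrative assumptions, not the actual executorch API: segments are concatenated at aligned offsets, and since only the offsets (not the unpadded sizes) survive serialization, deserialization can only split at offset boundaries.

```python
# Hypothetical sketch, not the real executorch implementation.
SEGMENT_ALIGNMENT = 16  # assumed alignment; executorch pads segments to a boundary

def pad_to(data: bytes, alignment: int) -> bytes:
    """Pad data with zeros up to the next alignment boundary."""
    remainder = len(data) % alignment
    if remainder == 0:
        return data
    return data + b"\x00" * (alignment - remainder)

def serialize_constants(buffers: list[bytes]) -> tuple[bytes, list[int]]:
    """Concatenate buffers into one segment, recording only start offsets."""
    segment = b""
    offsets = []
    for buf in buffers:
        offsets.append(len(segment))
        segment += pad_to(buf, SEGMENT_ALIGNMENT)
    return segment, offsets

def deserialize_constants(segment: bytes, offsets: list[int]) -> list[bytes]:
    """Split the segment back into buffers. The original (unpadded) sizes
    were not stored, so each restored buffer runs to the next offset and
    therefore contains the tensor bytes plus alignment padding."""
    ends = offsets[1:] + [len(segment)]
    return [segment[start:end] for start, end in zip(offsets, ends)]

original = [b"\x01\x02\x03", b"\x04" * 16, b"\x05\x06"]
segment, offsets = serialize_constants(original)
restored = deserialize_constants(segment, offsets)
# Each restored buffer starts with the original data but may be longer:
# the round trip preserves content, not byte-identical buffer sizes.
assert all(r.startswith(o) for r, o in zip(restored, original))
assert restored != original
```

Any consumer comparing programs before and after a round trip should compare tensor contents up to the recorded data length, not raw buffer bytes.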

pytorch-bot bot commented Sep 6, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/5141

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 1eb730a with merge base b69ae0c:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Sep 6, 2024
@facebook-github-bot (Contributor) commented:
This pull request was exported from Phabricator. Differential Revision: D62278416

lucylq added a commit to lucylq/executorch-1 that referenced this pull request Sep 9, 2024
Summary:
Pull Request resolved: pytorch#5141

Restore constant segment in deserialize_pte_binary.

Note that programs are not identical afterwards, as we do not store the size of the constant buffer. Instead, the restored program will contain tensor+padding in each buffer.

Reviewed By: dbort

Differential Revision: D62278416
@facebook-github-bot merged commit 549f14b into pytorch:main Sep 10, 2024
36 of 38 checks passed
Labels: CLA Signed, fb-exported
3 participants