Update module wrapper so that params are explicitly registered to the wrapper #10305
Conversation
Seeing an issue with linear where the FQNs for constants disappear. Registering `self.method_name` as a submodule of the wrapper means that the parameters are registered to the wrapper. cc @angelayi

```
  File "/data/users/lfq/fbsource/buck-out/v2/gen/fbcode/1af94fa701700343/executorch/test/models/__export_delegated_program__/export_delegated_program#link-tree/torch/export/_trace.py", line 1980, in _export_for_training
    export_artifact = export_func(
  File "/data/users/lfq/fbsource/buck-out/v2/gen/fbcode/1af94fa701700343/executorch/test/models/__export_delegated_program__/export_delegated_program#link-tree/torch/export/_trace.py", line 1473, in _strict_export
    _replace_param_buffer_names(param_buffer_table, export_graph_signature)
  File "/data/users/lfq/fbsource/buck-out/v2/gen/fbcode/1af94fa701700343/executorch/test/models/__export_delegated_program__/export_delegated_program#link-tree/torch/export/_trace.py", line 272, in _replace_param_buffer_names
    spec.target = param_buffer_table[spec.target]
KeyError: 'L__self___fn___self___linear.weight'
```

Differential Revision: [D73279618](https://our.internmc.facebook.com/intern/diff/D73279618/)
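A minimal sketch of the registration pattern described above, assuming hypothetical names (`MethodWrapper`, `Linear`, `fn`); this is not the actual ExecuTorch wrapper, only an illustration of why assigning the wrapped module to an attribute of the wrapper gives its parameters FQNs rooted at the wrapper:

```python
# Minimal sketch with hypothetical names (MethodWrapper, Linear, fn): not the
# real ExecuTorch wrapper, just an illustration of how attribute assignment on
# an nn.Module registers the wrapped module as a submodule, so its parameters
# get FQNs rooted at the wrapper (e.g. "fn.linear.weight").
import torch
import torch.nn as nn


class MethodWrapper(nn.Module):
    """Expose a single method of `module` as this wrapper's forward()."""

    def __init__(self, module: nn.Module, method_name: str):
        super().__init__()
        # Attribute assignment registers `module` as a submodule, which in
        # turn registers its parameters and buffers under this wrapper.
        self.fn = module
        self.method_name = method_name

    def forward(self, *args, **kwargs):
        return getattr(self.fn, self.method_name)(*args, **kwargs)


class Linear(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

    def forward(self, x):
        return self.linear(x)


if __name__ == "__main__":
    wrapper = MethodWrapper(Linear(), "forward")
    # Parameters are reachable from the wrapper under stable FQNs.
    print(sorted(name for name, _ in wrapper.named_parameters()))
    # -> ['fn.linear.bias', 'fn.linear.weight']

    # The wrapper can then be exported like any other nn.Module.
    ep = torch.export.export(wrapper, (torch.randn(2, 4),))
    print(ep.graph_signature)
```

With the module registered this way, `named_parameters()` on the wrapper yields targets like `fn.linear.weight`, which is presumably the kind of consistent FQN the `param_buffer_table` lookup in the traceback expects, rather than the mangled `L__self___fn___self___linear.weight` key.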
🔗 Helpful links: 🧪 see artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10305.

Note: links to docs will display an error until the docs builds have completed.

❗ 1 active SEV. If your PR is affected, please view it below.

❌ 2 new failures, 1 unrelated failure as of commit fc5b3f8 with merge base 06f912d.

BROKEN TRUNK: the following job failed but was also present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D73279618
Merged commit 73f4118 into gh/lucylq/61/base.