Fix quantization for input to reference model #2317

Closed
per wants to merge 2 commits from the reference_model_output branch

Conversation

@per (Collaborator) commented Mar 8, 2024

Add the zero point instead of subtracting it. The old code happened to work because the tests so far used all-ones inputs, which quantize to a zero point of -128; with that value, adding and subtracting produce the same np.int8 result, since int8 arithmetic wraps around. The scaled values also need to be rounded and clipped to the int8 range.

Signed-off-by: Per Åstrand [email protected]
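For clarity, a minimal sketch of the corrected behavior, assuming a plain affine int8 scheme; the function name and the scale/zero-point values are illustrative, not the actual ExecuTorch helpers:

```python
import numpy as np

def quantize_int8(x: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Affine int8 quantization: scale, ADD the zero point, round and clip."""
    q = np.round(x / scale) + zero_point  # add the zero point; don't subtract it
    return np.clip(q, -128, 127).astype(np.int8)  # clip before the int8 cast

# All-ones inputs mask the sign bug: they quantize with zp == -128, and
# 255 - (-128) = 383 wraps to the same np.int8 value as 255 + (-128) = 127.
x = np.ones((1, 1, 4, 4), dtype=np.float32)
print(quantize_int8(x, scale=1.0 / 255, zero_point=-128))  # -> all 127
```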

@pytorch-bot (bot) commented Mar 8, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/2317

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (3 Unrelated Failures)

As of commit f1b2bf8 with merge base f9cad4e:

FLAKY - The following jobs failed but were likely due to flakiness present on trunk:

BROKEN TRUNK - The following job failed but was also failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

  • trunk / test-coreml-delegate / macos-job (gh)
    /Users/runner/work/executorch/executorch/pytorch/executorch/backends/apple/coreml/runtime/delegate/ETCoreMLModelManager.mm:387:43: error: no member named 'ModelPackage' in '(anonymous namespace)::ModelAssetType'

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Mar 8, 2024
@per added the partner: arm label (for backend delegation, kernels, demo, etc. from the third-party partner, Arm) on Mar 8, 2024
@per per requested review from freddan80 and digantdesai March 8, 2024 14:10
@per (Collaborator, Author) commented Mar 8, 2024

Will fix the double decorators causing the failures.

```python
        self._test_add_tosa_BI_pipeline(self.Add2(), test_data)

    @unittest.skipIf(
        not VELA_INSTALLED,
        "There is no point in running U55 tests if the Vela tool is not installed",
    )
    def test_add2_u55_BI(self):
        test_data = (torch.ones(1, 1, 4, 4), torch.ones(1, 1, 4, 1))

    @parameterized.expand(Add2.test_parameters)
```
@per (Collaborator, Author) commented:

The combination of the @parameterized.expand and @unittest.skipIf decorators is causing the failures; a sketch of the working ordering follows below.
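A minimal runnable sketch of that ordering with the parameterized package, using illustrative test data (not the actual ExecuTorch test class). As the fix commit describes, unittest.skipIf sits closest to the function so it is applied first, and parameterized.expand wraps the result so each generated case inherits the skip marker:

```python
import unittest
import torch
from parameterized import parameterized

VELA_INSTALLED = False  # illustrative stand-in for the real Vela availability check

class TestAdd(unittest.TestCase):
    test_parameters = [
        ((torch.ones(1, 1, 4, 4), torch.ones(1, 1, 4, 1)),),
    ]

    # skipIf only marks the function, and expand then generates one test per
    # parameter set; functools.wraps inside expand copies the function's
    # __dict__, carrying the skip marker over to each generated test.
    @parameterized.expand(test_parameters)
    @unittest.skipIf(not VELA_INSTALLED, "Vela tool is not installed")
    def test_add2_u55_BI(self, test_data):
        self.assertEqual(test_data[0].shape, (1, 1, 4, 4))

if __name__ == "__main__":
    unittest.main()
```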

A Contributor commented:

still an issue?

@per marked this pull request as a draft on March 8, 2024 15:19
@per force-pushed the reference_model_output branch from eac5f44 to 63bd288 on March 13, 2024 13:14
@per marked this pull request as ready for review on March 13, 2024 13:16
@per force-pushed the reference_model_output branch 3 times, most recently from c9ff559 to 8b51570 on March 20, 2024 16:06
per added 2 commits March 20, 2024 19:42
Add the zero point instead of subtracting it. The old code happened to
work because the tests so far used all-ones inputs, which quantize to a
zero point of -128; adding and subtracting that value give the same
np.int8 result, since int8 arithmetic wraps around.
Also round and clip the scaled values to the int8 range.

Signed-off-by: Per Åstrand <[email protected]>

Change-Id: Ideaed6d072a4065573b38fb7476c7dbe8ba814fd
Fix the order of the decorators so that unittest.skipIf is applied first and parameterized.expand then expands the parameterized input.
Fix a bug in the add operator conversion so that inputs with different scales are handled correctly.

Signed-off-by: Per Åstrand <[email protected]>

Change-Id: Ic228cf0215e8171392776739936a53c025802fd5
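The scale-handling fix in the add conversion follows the usual pattern for quantized addition: operands with different quantization parameters must be brought into a common domain before the integer add. A simplified numpy sketch of that arithmetic (a hypothetical reference, not the PR's actual TOSA lowering, which performs this with integer RESCALE ops):

```python
import numpy as np

def quantized_add(qa, a_scale, a_zp, qb, b_scale, b_zp, out_scale, out_zp):
    """Add two int8 tensors whose quantization parameters may differ.

    Dequantize each operand with its own scale/zero point, add in a common
    domain, then requantize the sum with the output parameters.
    """
    a = (qa.astype(np.int32) - a_zp) * a_scale    # dequantize operand A
    b = (qb.astype(np.int32) - b_zp) * b_scale    # dequantize operand B
    q = np.round((a + b) / out_scale) + out_zp    # requantize the sum
    return np.clip(q, -128, 127).astype(np.int8)  # clip to the int8 range
```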
@per force-pushed the reference_model_output branch from 8b51570 to f1b2bf8 on March 20, 2024 18:43
@digantdesai (Contributor) left a comment:

LGTM


@facebook-github-bot (Contributor) commented:

@digantdesai has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented:

@digantdesai merged this pull request in d06ccd2.

@per per deleted the reference_model_output branch November 15, 2024 11:43
Labels: ciflow/trunk, CLA Signed, Merged, partner: arm