
[AutoDiff] Minor @differentiable attribute type-checking fix. #29477


Merged
merged 1 commit into swiftlang:master from autodiff-upstream-diff-attr on Jan 27, 2020

Conversation

dan-zheng
Contributor

Use derivative canonical generic signature to get the derivative generic
environment, not the other way around.

Fixes `@differentiable` attribute SILGen assertion failures on the `tensorflow`
branch. Exposed and fixed on the `tensorflow` branch in #29347.

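The description implies an ordering fix in the attribute type-checker: canonicalize the derivative generic signature first, then derive the generic environment from that canonical signature, rather than building the environment first and reading the signature back out of it. Below is a minimal C++ sketch of that ordering using the Swift compiler's `GenericSignature`/`GenericEnvironment` types; the variable names are hypothetical, the method names are approximate recollections of the compiler API, and this is not the actual diff.

```cpp
// Hypothetical sketch of the ordering the PR describes, not the real patch.
// `derivativeGenSig` stands in for the generic signature the
// @differentiable attribute type-checker computed for the derivative.

// Fixed order: canonicalize the derivative generic signature first,
// then ask the canonical signature for its generic environment
// (rather than obtaining an environment and reading a signature back).
CanGenericSignature derivativeCanGenSig =
    derivativeGenSig.getCanonicalSignature();
GenericEnvironment *derivativeGenEnv =
    derivativeCanGenSig->getGenericEnvironment();
```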
@dan-zheng dan-zheng requested review from rxwei and marcrasi January 27, 2020 19:37
@dan-zheng
Contributor Author

@swift-ci Please smoke test and merge

@compnerd
Member

@dan-zheng could you please add a test case that would previously trigger the assertion?

@dan-zheng
Contributor Author

> @dan-zheng could you please add a test case that would previously trigger the assertion?

I actually cannot add tests on the master branch because `@differentiable` attribute SILGen (where the assertion occurs) hasn't been upstreamed yet. Tests for the assertion already exist on the `tensorflow` branch.

This PR just brings the code in sync with the `tensorflow` branch after #29347. I'll be sure to add tests to PRs when possible!

@swift-ci swift-ci merged commit 9274e86 into swiftlang:master Jan 27, 2020
@dan-zheng dan-zheng deleted the autodiff-upstream-diff-attr branch January 27, 2020 22:06