[AutoDiff] Make force-unwrapping differentiable. #26826


Merged: 2 commits merged into swiftlang:tensorflow on Aug 25, 2019

Conversation

rxwei (Contributor) commented Aug 25, 2019

Optional force-unwrapping is a mathematically transposable operation. This patch makes force-unwrapping differentiable by applying its transpose to adjoint buffers.

```swift
func bla<T: Differentiable & FloatingPoint>(_ t: T) -> (T, Float) where T == T.TangentVector {
    gradient(at: t, Float(1)) { (x, y) in (x as! Float) * y }
}
print(bla(Float(2))) // (1, 2)
```
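To illustrate the idea behind the patch: force-unwrapping maps `.some(x)` to `x`, a linear operation whose transpose injects the adjoint back into the optional's tangent. The sketch below writes that pullback by hand in plain Swift; `unwrapWithPullback` is a hypothetical helper for illustration, not compiler or standard-library API.

```swift
// Sketch only: a hand-written value-with-pullback for Optional force-unwrap.
// `unwrapWithPullback` is a hypothetical name, not part of the implementation.
func unwrapWithPullback(_ x: Float?) -> (value: Float, pullback: (Float) -> Float?) {
    let value = x!  // primal: force-unwrap .some(x); traps on nil
    // The unwrap is linear, so its transpose simply wraps the incoming
    // adjoint back into the Optional's tangent space.
    return (value, { adjoint in .some(adjoint) })
}

let (v, pullback) = unwrapWithPullback(3.0)
print(v)              // 3.0
print(pullback(1.0)!) // 1.0: the adjoint flows through unchanged
```

In the actual patch this transpose is applied by the differentiation pass directly on adjoint buffers, rather than through a wrapper function like this.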

Resolves TF-455.

@rxwei rxwei added the tensorflow This is for "tensorflow" branch PRs. label Aug 25, 2019
@rxwei rxwei requested review from dan-zheng and marcrasi August 25, 2019 05:18
rxwei (Contributor, Author) commented Aug 25, 2019

@swift-ci please test tensorflow

rxwei (Contributor, Author) commented Aug 25, 2019

@swift-ci please test tensorflow Linux

@rxwei rxwei merged commit cf52777 into swiftlang:tensorflow Aug 25, 2019
@rxwei rxwei deleted the force-unwrapping-differentiable branch August 25, 2019 06:06