[AutoDiff] Type-checking support for `inout` parameter differentiation. #29959
Conversation
Let's check that tests pass.
```
@@ -1158,6 +1146,42 @@ final class FinalClass: Differentiable {
  }
}

// Test `inout` parameters.

@differentiable(wrt: y)
```
Note: I intentionally didn't add tests for `@differentiable(jvp:vjp:)`, since that's actively being removed (TF-1162).
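For context, here is a hedged sketch of the kind of declaration these new test cases exercise. The function name and body are hypothetical, not taken from the diff above; it simply assumes the post-patch rules under which a sole `inout` differentiability parameter counts as the function's single semantic result.

```swift
// Hypothetical positive test: `y` is the only wrt parameter and is
// `inout`, so it serves as the single semantic result and the
// attribute should now type-check.
@differentiable(wrt: y)
func inoutParameter(_ x: Float, _ y: inout Float) {
  y = x * y
}
```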
Semantically, an `inout` parameter is both a parameter and a result. `@differentiable` and `@derivative` attributes now support original functions with one "semantic result": either a formal result type or an `inout` parameter. The differential/pullback type of a function with `inout` differentiability parameters also has `inout` parameters. This is ideal for performance.

Differential typing rules:
- Case 1: original function has no `inout` parameters.
  - Original: `(T0, T1, ...) -> R`
  - Differential: `(T0.Tan, T1.Tan, ...) -> R.Tan`
- Case 2: original function has a non-wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Differential: `(T0.Tan, ...) -> T1.Tan`
- Case 3: original function has a wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Differential: `(T0.Tan, inout T1.Tan, ...) -> Void`

Pullback typing rules:
- Case 1: original function has no `inout` parameters.
  - Original: `(T0, T1, ...) -> R`
  - Pullback: `R.Tan -> (T0.Tan, T1.Tan, ...)`
- Case 2: original function has a non-wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Pullback: `(T1.Tan) -> (T0.Tan, ...)`
- Case 3: original function has a wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Pullback: `(inout T1.Tan) -> (T0.Tan, ...)`

Resolves TF-1164.
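As a hedged illustration of Case 3 (not code from this patch; the function name is hypothetical, and the attribute spelling is the one used in the test diff above), a declaration whose wrt parameter is `inout` returns `Void`, and its pullback takes that parameter's tangent `inout` and returns tangents for the remaining wrt parameters:

```swift
// Hypothetical example, assuming the post-patch type-checking rules.
// `y` is an `inout` differentiability parameter, so the function has a
// single semantic result (the mutated `y`) and must return Void.
@differentiable(wrt: (x, y))
func multiplyAdd(_ x: Float, _ y: inout Float) {
  y = y * x + x
}

// Per the typing rules above (with Float.TangentVector == Float):
//   Differential: (Float, inout Float) -> Void
//   Pullback:     (inout Float) -> Float
```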
Force-pushed from be7f093 to 44cc330.
@swift-ci Please smoke test
- `SemanticFunctionResultType` -> `AutoDiffSemanticFunctionResultType`
- `getAutoDiffDerivativeFunctionLinearMapResultType` -> `getAutoDiffDerivativeFunctionLinearMapType`
Force-pushed from bb68111 to 165051e.
Rename `getOriginalFunctionSemanticResultType` to `getFunctionSemanticResultType`. It is no longer strictly related to "original" functions: scrub all mentions of "original".
I don't think these are correct.
There are multiple occurrences of
@swift-ci Please smoke test
Merge #29959: type-checking support for `inout` parameter differentiation.

Intentionally does not fix failing `inout` parameter differentiation tests. Those will be fixed in a separate PR to facilitate code review.

```
Failing Tests (3):
    Swift(linux-x86_64) :: AutoDiff/downstream/forward_mode_diagnostics.swift
    Swift(linux-x86_64) :: AutoDiff/downstream/activity_analysis.swift
    Swift(linux-x86_64) :: AutoDiff/downstream/differentiation_transform_diagnostics.swift
```

Lesson learned: next time when doing AutoDiff changes that touch both upstream and downstream code, create PRs to both the `tensorflow` and `master` branches, and merge the `tensorflow` branch PR first. This minimizes conflicts during the `tensorflow -> master` merge process.
Cherry-pick #29959: AST and SIL typing rules for `inout` parameter differentiation.

---

Add reverse-mode differentiation support for `apply` with `inout` arguments.

Notable pullback generation changes:
- If the pullback seed argument is `inout`, assign it (rather than a copy) directly as the adjoint buffer of the original result. This is important so the value is updated in-place.
- In `visitApplyInst`: skip adjoint accumulation for `inout` arguments. Adjoint accumulation for `inout` arguments occurs when callee pullbacks are applied, so no extra accumulation is necessary.

Add derivatives for functions with `inout` parameters in the stdlib for testing:
- `FloatingPoint` operations: `+=`, `-=`, `*=`, `/=`
- `Array.append`

Resolves TF-1165.

Todos:
- Add more tests, e.g. SILGen tests for `inout` derivative typing rules.
- Evaluate performance of `inout` derivatives vs functional derivatives + mutation.
- TF-1166: enable `@differentiable` attribute on `set` accessors.
- TF-1173: add forward-mode differentiation support for `apply` with `inout` parameters.

Exposes TF-1175: incorrect activity for class arguments.
Exposes TF-1176: incorrect activity for class `modify` accessors.

Add negative tests.
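A derivative with an `inout` parameter can be registered along the lines of the sketch below. It mirrors the shape of the stdlib `+=` derivative described above, but the free-function names are hypothetical and the exact stdlib declarations may differ.

```swift
// Hypothetical stand-in for `+=`, used to keep the sketch self-contained
// (the actual patch registers derivatives on stdlib operators directly).
func addAssign(_ x: inout Float, _ y: Float) {
  x += y
}

// Case 3 shape: the pullback's `inout` parameter carries the adjoint of
// `x`, updated in place; the returned value is the adjoint of `y`.
@derivative(of: addAssign)
func vjpAddAssign(_ x: inout Float, _ y: Float)
  -> (value: Void, pullback: (inout Float) -> Float)
{
  addAssign(&x, y)
  // d(x + y)/dx = 1 and d(x + y)/dy = 1: leave x's adjoint unchanged
  // and return it as y's adjoint.
  return ((), { (v: inout Float) -> Float in v })
}
```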
Semantically, an `inout` parameter is both a parameter and a result. `@differentiable` and `@derivative` attributes now support original functions with one "semantic result": either a formal result or an `inout` parameter.

The differential/pullback type of a function with `inout` differentiability parameters also has `inout` parameters. This is ideal for performance.

Differential typing rules:
- Case 1: original function has no `inout` parameters.
  - Original: `(T0, T1, ...) -> R`
  - Differential: `(T0.Tan, T1.Tan, ...) -> R.Tan`
- Case 2: original function has a non-wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Differential: `(T0.Tan, ...) -> T1.Tan`
- Case 3: original function has a wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Differential: `(T0.Tan, inout T1.Tan, ...) -> Void`

Pullback typing rules:
- Case 1: original function has no `inout` parameters.
  - Original: `(T0, T1, ...) -> R`
  - Pullback: `R.Tan -> (T0.Tan, T1.Tan, ...)`
- Case 2: original function has a non-wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Pullback: `(T1.Tan) -> (T0.Tan, ...)`
- Case 3: original function has a wrt `inout` parameter.
  - Original: `(T0, inout T1, ...) -> Void`
  - Pullback: `(inout T1.Tan) -> (T0.Tan, ...)`

Resolves TF-1164.
Examples:
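A minimal usage sketch, assuming the post-patch rules (the function name is hypothetical and this is not the patch's own example): an `inout` update such as `+=` participates in differentiation with an `inout` pullback, while the enclosing function keeps the ordinary Case 1 shape.

```swift
// Hypothetical example: the `+=` call inside the body uses an `inout`
// pullback of type (inout Float) -> Float, while the outer function's
// pullback has the usual (Float) -> Float shape.
@differentiable(wrt: x)
func doubleThenSquare(_ x: Float) -> Float {
  var result = x
  result += x          // mutating update differentiated via the `inout` rules
  return result * result
}

// f(x) = (2x)^2, so f'(3) = 8 * 3 = 24.
```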
This patch includes all `inout` parameter differentiation changes currently relevant to the `master` branch, e.g. SIL derivative type calculation changes. Remaining changes involve code that hasn't yet been upstreamed. Those changes will be committed to the `tensorflow` branch and eventually upstreamed.

End-to-end `inout` parameter differentiation is tested in #29956.