[Sema] [AutoDiff] Derive @_fieldwiseProductSpace when tangent/cotangent is Self #21737

Merged
rxwei merged 3 commits into swiftlang:tensorflow from struct-extract-diff on Jan 9, 2019

Conversation

@rxwei (Contributor) commented Jan 9, 2019

In Differentiable derived conformances, fix a bug where the @_fieldwiseProductSpace attribute was not added to derived typealiases when TangentVector and/or CotangentVector is inferred to be Self. Without this change, a field access such as x.a could not be differentiated:

struct AdditiveTangentIsSelf : AdditiveArithmetic, Differentiable {
  var a: Float
}
let _: @autodiff (AdditiveTangentIsSelf) -> Float = { x in
  x.a + x.a
}
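For illustration (a hedged sketch of the intended synthesis, not the exact compiler output), the derived conformance should now behave as if the following typealiases were written, with the attribute attached even though both associated types resolve to Self:

```swift
struct AdditiveTangentIsSelf: AdditiveArithmetic, Differentiable {
  var a: Float

  // Both associated types are inferred as Self; before this fix the
  // derived typealiases were missing @_fieldwiseProductSpace, so the
  // differentiation pass could not differentiate field accesses.
  @_fieldwiseProductSpace
  typealias TangentVector = AdditiveTangentIsSelf
  @_fieldwiseProductSpace
  typealias CotangentVector = AdditiveTangentIsSelf
}
```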

@rxwei rxwei requested a review from dan-zheng January 9, 2019 10:48
@rxwei rxwei added the tensorflow This is for "tensorflow" branch PRs. label Jan 9, 2019
@rxwei (Contributor, Author) commented Jan 9, 2019

@swift-ci please test tensorflow

5 similar comments

@dan-zheng (Contributor)

@swift-ci Please test tensorflow

2 similar comments

@rxwei (Contributor, Author) commented Jan 9, 2019

@swift-ci please test tensorflow

@rxwei (Contributor, Author) commented Jan 9, 2019

@swift-ci please test tensorflow macOS

@dan-zheng (Contributor) left a comment

Good catch!

@@ -374,6 +374,7 @@ ERROR(autodiff_unsupported_type,none,
"differentiating '%0' is not supported yet", (Type))
ERROR(autodiff_function_not_differentiable,none,
"function is not differentiable", ())
// TODO: Change this to a note.
@dan-zheng (Contributor)

Do you want to change this to a note?

@rxwei (Contributor, Author)

No, and here's why. The DifferentiationInvoker infrastructure keeps track of how each differentiation task is created. Only the diagnostic at the very bottom of the invocation stack should be an error.

func f0(x) {
  f1(x)
}
func f1(x) {
  f2(x)
}
func f2(x) {
  f3(x) // non-differentiable
}
derivative(of: f0)

If the expression f3(x) is not differentiable, we should not emit an error at f3(x), because that is not where the user requested the derivative, and the user should not be told to change code there before knowing what is actually causing differentiation to occur. Instead, we emit notes along the invocation stack until we reach the user's differentiation request, which is always at the bottom of the invocation stack.

func f0(x) {
  f1(x) // note: when differentiating this function call
}
func f1(x) {
  f2(x) // note: when differentiating this function call
}
func f2(x) {
  f3(x) // note: expression is not differentiable
}
derivative(of: f0) // error: f0 is not differentiable

As to why this cannot be changed to a note today: we used to have DifferentialOperator as a case in DifferentiationInvoker, and the differential operator was, as you know, #gradient. When we switched over to generalized differentiability™, there was no longer a compiler-known differential operator; function conversion to @autodiff is now the only formal way to trigger differentiation in expressions. We did not add a case for function conversion to DifferentiationInvoker, so the user's request is not recorded and no error is emitted at the request site. If we changed this error to a note today, the pass pipeline would assume there was no error, keep running later passes on partially or incorrectly transformed code, and cause crashes or undefined runtime behavior.
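The note-then-error walk described above can be modeled with a small sketch (hypothetical names and a plain array in place of the real data structures; the actual logic lives in the compiler's differentiation pass, not in user code):

```swift
// Hypothetical model of how diagnostics walk up the invocation stack.
enum Invoker {
  case functionCall(callee: String, parent: Int?)  // index of parent invoker
  case userRequest(function: String)               // bottom of the stack
}

func emitDiagnostics(stack: [Invoker], failing: Int) -> [String] {
  var diags: [String] = []
  var index: Int? = failing
  while let i = index {
    switch stack[i] {
    case .functionCall(let callee, let parent):
      // Intermediate frames only ever get notes, never errors.
      diags.append(i == failing
        ? "note: expression '\(callee)' is not differentiable"
        : "note: when differentiating this function call to '\(callee)'")
      index = parent
    case .userRequest(let function):
      // Only the user's original differentiation request gets the error.
      diags.append("error: '\(function)' is not differentiable")
      index = nil
    }
  }
  return diags
}
```

Because today's pipeline has no userRequest case recorded for @autodiff function conversions, the walk never reaches a frame that emits the error, which is why the diagnostic must stay an error at the failing expression for now.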

@rxwei (Contributor, Author)

PS: This definitely needs to be fixed. Let me know if you wanna make diagnostics better :)

@dan-zheng (Contributor)

Thanks for the detailed info! Definitely interested in improving AD diagnostics sometime.

@rxwei (Contributor, Author) commented Jan 9, 2019

@swift-ci please test tensorflow

1 similar comment

@dan-zheng (Contributor)

@swift-ci Please test tensorflow

The crucial line is copying formal access from nominal, which should
fix the failing test.
@dan-zheng (Contributor)

@swift-ci Please test tensorflow

@rxwei rxwei merged commit 1db433c into swiftlang:tensorflow Jan 9, 2019
@rxwei rxwei deleted the struct-extract-diff branch January 9, 2019 20:27