[AutoDiff] Add loop differentiation negative testcases. #27796


Merged

dan-zheng merged 2 commits into swiftlang:tensorflow from dan-zheng:TF-933 on Oct 19, 2019

Conversation

@dan-zheng (Contributor) commented Oct 19, 2019

Loop differentiation produces incorrect results when the reduction accumulation
variable is not initialized with an active parameter, e.g. when using
var result = 1 instead of var result = x.

```
func for_loop_nonactive_initial_value(_ x: Float) -> Float {
  var result: Float = 1
  for _ in 0..<2 {
    result = result * x
  }
  return result
}
print(valueWithGradient(at: 3, in: for_loop_nonactive_initial_value))
//   Actual: (value: 9.0, gradient: 3.0)
// Expected: (value: 9.0, gradient: 6.0)
```
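
For reference, the loop computes `x * x * 1 = x²`, so the correct gradient is d/dx x² = 2x = 6 at x = 3. A straight-line reformulation with no loop (a minimal sketch using the same `valueWithGradient` API as above, not part of this patch) differentiates as expected:

```
// Hypothetical sanity check: the same computation written without a loop.
func square(_ x: Float) -> Float {
  return x * x
}
print(valueWithGradient(at: 3, in: square))
// (value: 9.0, gradient: 6.0)
```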

TF-933 tracks this issue. This patch adds negative test cases (currently failing).

Workarounds are to reformulate the accumulation using `var result = x`, or to use
`Array.differentiableReduce`:

```
// Workaround 1: use `var result = x` instead of `var result = 1`.
func for_loop_active_initial_value(_ x: Float) -> Float {
  var result = x
  for _ in 0..<1 {
    result = result * x
  }
  return result
}
print(valueWithGradient(at: 3, in: for_loop_active_initial_value))
// (value: 9.0, gradient: 6.0)

// Workaround 2: use `Array.differentiableReduce`.
func power(_ x: Double, _ exponent: Int) -> Double {
  // Array allocation isn't efficient, but `Array.differentiableReduce` has correct derivatives.
  let array = Array(repeating: x, count: exponent)
  return array.differentiableReduce(1, *)
}
print(valueWithGradient(at: 3, in: { x in power(x, 2) }))
// (value: 9.0, gradient: 6.0)
```
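
As a further sanity check on workaround 2 (a hypothetical extra call, not part of this patch), the same `power` helper can be evaluated at a different exponent, where the closed-form derivative of x³ is 3x²:

```
// Hypothetical check: d/dx x³ = 3x², so the gradient at x = 3 should be 27.
print(valueWithGradient(at: 3, in: { x in power(x, 3) }))
// Expected: (value: 27.0, gradient: 27.0)
```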

With 0-based loop bounds, the final iteration count is easier to understand. For example, with `for _ in 0..<2 { ... }`, the loop body runs 2 times.
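
A minimal illustration of the bound (a hypothetical snippet, not from the patch):

```
// The loop body runs exactly 2 times with a `0..<2` bound.
var iterations = 0
for _ in 0..<2 {
  iterations += 1
}
print(iterations) // 2
```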
@dan-zheng added the `tensorflow` label (This is for "tensorflow" branch PRs.) on Oct 19, 2019
@dan-zheng requested review from rxwei and marcrasi on October 19, 2019, 22:18
@dan-zheng (Contributor, Author) commented:

@swift-ci Please test tensorflow

@dan-zheng merged commit 519236c into swiftlang:tensorflow on Oct 19, 2019
@dan-zheng deleted the TF-933 branch on October 19, 2019, 23:41
dan-zheng added a commit to dan-zheng/swift that referenced this pull request Oct 30, 2019
Loop differentiation produces incorrect results when the reduction accumulation
variable is not initialized with an active parameter. TF-933 tracks this issue.

This patch is a follow-up to swiftlang#27796, adding
negative test cases for while and repeat-while loops.
dan-zheng added a commit that referenced this pull request Oct 31, 2019
… (#27973)

Loop differentiation produces incorrect results when the reduction accumulation
variable is not initialized with an active parameter. TF-933 tracks this issue.

This patch is a follow-up to #27796, adding
negative test cases for while and repeat-while loops.