Loop differentiation produces incorrect results when the reduction accumulator
is initialized with a non-active value instead of an active parameter, e.g.
`var result = 1` instead of `var result = x`.
```swift
func for_loop_nonactive_initial_value(_ x: Float) -> Float {
    var result: Float = 1
    for _ in 0..<2 {
        result = result * x
    }
    return result
}
print(valueWithGradient(at: 3, in: for_loop_nonactive_initial_value))
// Actual: (value: 9.0, gradient: 3.0)
// Expected: (value: 9.0, gradient: 6.0)
```
TF-933 tracks this issue. This patch adds negative test cases (currently failing).
Workarounds are to rewrite the function using `var result = x`, or to use
`Array.differentiableReduce`.
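As a sketch of the first workaround (assuming a Swift for TensorFlow toolchain where `valueWithGradient(at:in:)` is available; the function name below is hypothetical), seeding the accumulator with the active parameter keeps the derivative flowing through every multiplication:

```swift
// Workaround sketch: initialize the accumulator from the active parameter
// `x` (contributing one factor of x), and run one fewer loop iteration so
// the function still computes x * x.
func for_loop_active_initial_value(_ x: Float) -> Float {
    var result: Float = x
    for _ in 0..<1 {
        result = result * x
    }
    return result
}

print(valueWithGradient(at: 3, in: for_loop_active_initial_value))
// Expected per TF-933: (value: 9.0, gradient: 6.0)
```

Because the initial value is now active, the pullback accounts for both factors of `x`, giving the correct gradient `2x`.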