
Commit 8bd161b

Shashi456 authored and dan12411 committed
Add documentation for Kullback-Leibler divergence (tensorflow#239)
1 parent ea0f36a commit 8bd161b

File tree

1 file changed (+2, -1 lines changed)


Sources/TensorFlow/Loss.swift

Lines changed: 2 additions & 1 deletion
@@ -153,7 +153,8 @@ public func poissonLoss<Scalar: TensorFlowFloatingPoint>(
     return (predicted - expected * log(predicted)).mean()
 }

-/// Returns the Kullback-Leibler divergence between predictions and expectations.
+/// Returns the Kullback-Leibler divergence (KL divergence) between between expectations and predictions.
+/// Given two distributions `p` and `q`, KL divergence computes `(p * log(p / q)).sum()`.
 ///
 /// - Parameters:
 ///   - predicted: Predicted outputs from a neural network.
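The doc comment added by this commit describes KL divergence as `(p * log(p / q)).sum()`. A minimal sketch of that formula in Python (not the Swift for TensorFlow implementation; `kl_divergence` and the example distributions are illustrative only):

```python
import math

def kl_divergence(p, q):
    # Element-wise p * log(p / q), then summed, matching the
    # (p * log(p / q)).sum() expression in the doc comment.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Two example probability distributions (hypothetical values).
p = [0.4, 0.6]
q = [0.5, 0.5]

print(kl_divergence(p, q))  # small positive value, since p != q
print(kl_divergence(p, p))  # 0.0: a distribution's divergence from itself
```

Note that KL divergence is asymmetric: `kl_divergence(p, q)` generally differs from `kl_divergence(q, p)`, which is why the doc comment's argument order (expectations vs. predictions) matters.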
