
Commit 1b3f7c9

Shashi456 authored and rxwei committed
Add documentation for Kullback-Leibler divergence (#239)
1 parent d42e80f · commit 1b3f7c9

1 file changed: +2 −1 lines

Sources/TensorFlow/Loss.swift

Lines changed: 2 additions & 1 deletion
@@ -153,7 +153,8 @@ public func poissonLoss<Scalar: TensorFlowFloatingPoint>(
     return (predicted - expected * log(predicted)).mean()
 }
 
-/// Returns the Kullback-Leibler divergence between predictions and expectations.
+/// Returns the Kullback-Leibler divergence (KL divergence) between expectations and predictions.
+/// Given two distributions `p` and `q`, KL divergence computes `(p * log(p / q)).sum()`.
 ///
 /// - Parameters:
 ///   - predicted: Predicted outputs from a neural network.
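For context, here is a minimal sketch of the loss function this doc comment sits above, plus a worked usage example. The diff itself only confirms the doc comment, the formula `(p * log(p / q)).sum()`, and the `predicted` parameter; the function name `kullbackLeiblerDivergence`, the `expected` parameter, and the `@differentiable` attribute are assumptions based on the surrounding Loss.swift conventions, and the exact body in the repository may differ.

import TensorFlow

// Sketch of the documented loss, assuming the body follows the formula
// in the doc comment, with p = expected and q = predicted.
@differentiable(wrt: predicted)
public func kullbackLeiblerDivergence<Scalar: TensorFlowFloatingPoint>(
    predicted: Tensor<Scalar>,
    expected: Tensor<Scalar>
) -> Tensor<Scalar> {
    // (p * log(p / q)).sum(), summed over all elements.
    return (expected * log(expected / predicted)).sum()
}

// Usage: KL divergence between two small categorical distributions.
let p = Tensor<Float>([0.5, 0.3, 0.2])  // expected ("true") distribution
let q = Tensor<Float>([0.4, 0.4, 0.2])  // predicted distribution
let kl = kullbackLeiblerDivergence(predicted: q, expected: p)
// 0.5*log(0.5/0.4) + 0.3*log(0.3/0.4) + 0.2*log(0.2/0.2) ≈ 0.0253

Note that the divergence is asymmetric: swapping `predicted` and `expected` generally yields a different value, which is why the doc comment spells out which argument plays which role.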
