
Adding Kullback-Leibler Divergence #226

Merged 1 commit into tensorflow:master on Jun 13, 2019

Conversation

@Shashi456 (Contributor) commented on Jun 13, 2019:

Build and tests pass locally.

References #127.

@rxwei merged commit 3a256de into tensorflow:master on Jun 13, 2019
@@ -98,6 +98,18 @@ public func poissonLoss<Scalar: TensorFlowFloatingPoint>(
return (predicted - expected * log(predicted)).mean()
}

/// Returns the Kullback-Leibler divergence between predictions and expectations.
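
The diff above is truncated to the new doc comment. Following the shape of poissonLoss and the formula discussed in the review below, the added function plausibly looks like the sketch here; the function name, parameter order, and the sum reduction are assumptions rather than quotes from the patch:

import TensorFlow

/// Returns the Kullback-Leibler divergence between expectations and predictions,
/// treating `expected` as the true distribution P and `predicted` as Q.
/// (Sketch only; name, labels, and reduction are assumed, not taken from the diff.)
public func kullbackLeiblerDivergence<Scalar: TensorFlowFloatingPoint>(
    predicted: Tensor<Scalar>, expected: Tensor<Scalar>
) -> Tensor<Scalar> {
    // KL(P ‖ Q) = Σ p · log(p / q), summed over all elements.
    return (expected * log(expected / predicted)).sum()
}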
Contributor:

You should probably say "between expectations and predictions" and also add the actual equation, because KL divergence is not symmetric, so this may lead people to think the opposite of what's intended.
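
For reference, the standard definition (with P the expected distribution and Q the predicted one) is:

    D_KL(P ‖ Q) = Σ_i P(i) · log(P(i) / Q(i))

Since D_KL(P ‖ Q) ≠ D_KL(Q ‖ P) in general, the argument order stated in the doc comment matters.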

Contributor:

@Shashi456 could you possibly make a new PR addressing this?

Contributor (Author):

On it. I'll have it up in a few hours.

Contributor:

Great, thanks!

@Shashi456 deleted the kld branch on June 14, 2019 at 04:45.
dan12411 pushed a commit to dan12411/swift-apis that referenced this pull request on Jun 15, 2019.