Conversation
One question: what should I call this? Is
I feel it's nice if we can add citations to the papers that introduced the various loss functions, layers, etc., whenever they're not one of the super standard/common ones.
@tanmayb123 Could you add a citation to this paper and also add the functional form of the loss in the documentation string?
Sure thing, but @rxwei had decided some time ago not to include loss functions that aren't available in Keras, and this PR stayed open. What should I do?
Let's add it for now! |
@@ -53,3 +53,21 @@ public func sigmoidCrossEntropy<Scalar: TensorFlowFloatingPoint>(
        (Tensor<Scalar>(1) - labels) * log(Tensor<Scalar>(1) - logits)
    return -loss.mean(alongAxes: 0).sum()
}
/// Computes the triplet loss.
Suggested change:
- /// Computes the triplet loss.
+ /// Returns the triplet loss.
Also add the citation as Anthony suggested.
Hi @tanmayb123! It looks like this PR is coming from an unknown repository, so we'll close it out for now. But if you'd like to address the comments and add a test, I think we'd love to get this merged in. All the best,
@tanmayb123 should I borrow your code and submit a new PR with the requested changes? I was planning to submit a PR against your fork, but it seems the branch containing this code is missing.
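The paper the reviewers asked to cite is not linked in this thread, but the triplet loss in its common squared-Euclidean form is usually attributed to FaceNet (Schroff et al., 2015). Since the PR's branch is gone, here is a minimal sketch of what the requested function might look like, written in the same style as the `sigmoidCrossEntropy` hunk above. The name `tripletLoss`, the parameter labels, and the default `margin` value are assumptions for illustration, not the PR's actual code:

```swift
import TensorFlow

/// Computes the triplet loss between the given embedding batches:
///
///     loss = mean(max(‖anchor − positive‖² − ‖anchor − negative‖² + margin, 0))
///
/// - Parameters:
///   - anchor: Embeddings of the anchor examples, shape [batch, embeddingSize].
///   - positive: Embeddings of examples matching the anchors.
///   - negative: Embeddings of examples not matching the anchors.
///   - margin: Minimum gap enforced between the positive and negative distances.
public func tripletLoss<Scalar: TensorFlowFloatingPoint>(
    anchor: Tensor<Scalar>,
    positive: Tensor<Scalar>,
    negative: Tensor<Scalar>,
    margin: Scalar = 1
) -> Tensor<Scalar> {
    // Squared Euclidean distance per example, reduced over the embedding axis.
    let positiveDistance = (anchor - positive).squared().sum(alongAxes: 1)
    let negativeDistance = (anchor - negative).squared().sum(alongAxes: 1)
    // Hinge on the margin, then average over the batch.
    return max(positiveDistance - negativeDistance + margin, Tensor<Scalar>(0)).mean()
}
```

Whether the loss should reduce with `mean()` or `sum()` (and whether `margin` should be a `Tensor` for differentiability) is exactly the kind of naming/API question raised at the top of the thread, so those choices are placeholders.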