This repository was archived by the owner on Jul 1, 2023. It is now read-only.

Commit a74064d

Shashi456 authored and rxwei committed
Documentation for Squaredhingeloss and linting in losstests (#229)
1 parent fb3135c commit a74064d

File tree

2 files changed: +6 −1 lines changed

Sources/TensorFlow/Loss.swift

Lines changed: 5 additions & 0 deletions
@@ -65,6 +65,11 @@ public func hingeLoss<Scalar: TensorFlowFloatingPoint>(
     return max(Tensor(1) - expected * predicted, Tensor(0)).mean()
 }
 
+/// Returns the squared hinge loss between predictions and expectations.
+///
+/// - Parameters:
+///   - predicted: Predicted outputs from a neural network.
+///   - expected: Expected values, i.e. targets, that correspond to the correct output.
 @differentiable(wrt: predicted)
 public func squaredHingeLoss<Scalar: TensorFlowFloatingPoint>(
     predicted: Tensor<Scalar>, expected: Tensor<Scalar>
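The diff truncates the body of `squaredHingeLoss`. As a standalone illustration of the quantity the new doc comment describes, here is a minimal pure-Swift sketch (not the swift-apis `Tensor` implementation; this array-based helper is hypothetical): squared hinge loss is the mean of the squared clamped margins `max(1 - expected * predicted, 0)`.

```swift
import Foundation

/// Squared hinge loss over plain arrays: mean(max(1 - expected * predicted, 0)^2).
/// A hypothetical standalone helper for illustration; the real API in
/// Sources/TensorFlow/Loss.swift operates on Tensor<Scalar> values.
func squaredHingeLoss(predicted: [Double], expected: [Double]) -> Double {
    precondition(predicted.count == expected.count, "inputs must have equal length")
    let squaredMargins = zip(predicted, expected).map { p, e -> Double in
        let margin = max(1 - e * p, 0)  // hinge: zero once the margin is satisfied
        return margin * margin          // squaring penalizes violations quadratically
    }
    return squaredMargins.reduce(0, +) / Double(squaredMargins.count)
}

// Labels in {-1, 1}; only the first pair violates the margin (1 - 0.5 = 0.5).
let loss = squaredHingeLoss(predicted: [0.5, -1.0, 2.0], expected: [1.0, -1.0, 1.0])
print(loss)
```

Note that, unlike the plain hinge loss a few lines above it in the diff, the squared variant is differentiable at the margin boundary, which is one common reason to prefer it.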

Tests/TensorFlowTests/LossTests.swift

Lines changed: 1 addition & 1 deletion
@@ -201,7 +201,7 @@ final class LossTests: XCTestCase {
         ("testHingeLoss", testHingeLoss),
         ("testCategoricalHingeLoss", testCategoricalHingeLoss),
         ("testSquaredHingeLoss", testSquaredHingeLoss),
-        ("testPoissonLoss",testPoissonLoss),
+        ("testPoissonLoss", testPoissonLoss),
         ("testSoftmaxCrossEntropyWithProbabilitiesLoss",
          testSoftmaxCrossEntropyWithProbabilitiesLoss),
         ("testSoftmaxCrossEntropyWithProbabilitiesGrad",

0 commit comments
