This repository was archived by the owner on Jul 1, 2023. It is now read-only.

Commit 7993ba0

jon-tow and Shashi456 committed

Reword documentation summary

Co-Authored-By: Pawan Sasanka Ammanamanchi <[email protected]>

1 parent 904363d · commit 7993ba0

File tree: 1 file changed (+1 −1)


Sources/TensorFlow/Operators/Math.swift

Lines changed: 1 addition & 1 deletion
@@ -1051,7 +1051,7 @@ func _vjpRelu<T: TensorFlowFloatingPoint>(
   (relu(x), { v in Tensor(x .> 0) * v })
 }
 
-/// Returns the Gaussian Error Linear Unit (GELU) of the specified tensor element-wise.
+/// Returns a tensor by applying the Gaussian Error Linear Unit (GELU) to the specified tensor element-wise.
 ///
 /// Specifically, `gelu` approximates `xP(X <= x)`, where `P(X <= x)` is the Standard Gaussian
 /// cumulative distribution, by computing: x * [0.5 * (1 + tanh[√(2/π) * (x + 0.044715 * x^3)])].

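For reference, here is a minimal standalone sketch of the tanh-based approximation described in the doc comment above, written against plain Swift Doubles rather than the library's Tensor type. The function name geluApprox and its scalar signature are illustrative assumptions, not part of the TensorFlow Swift API.

import Foundation

// Tanh-based GELU approximation from the doc comment above:
//   gelu(x) ≈ x * 0.5 * (1 + tanh(√(2/π) * (x + 0.044715 * x^3)))
// Evaluated on a plain Double for illustration (geluApprox is a
// hypothetical helper, not the library's element-wise gelu).
func geluApprox(_ x: Double) -> Double {
    let coefficient = (2.0 / Double.pi).squareRoot()
    let inner = coefficient * (x + 0.044715 * x * x * x)
    return x * 0.5 * (1.0 + tanh(inner))
}

// GELU stays close to the identity for large positive inputs
// and close to zero for large negative inputs.
print(geluApprox(2.0))   // ≈ 1.95
print(geluApprox(-2.0))  // ≈ -0.045
print(geluApprox(0.0))   // 0.0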