Commit e87347d

up
1 parent 1652a15 commit e87347d

File tree

1 file changed: 2 additions, 1 deletion

examples/apple/coreml/llama/llama_transformer.py

@@ -151,7 +151,8 @@ def _norm(self, x):
         """
         x_max, _ = torch.abs(x).max(-1, keepdim=True)
         x = x / x_max  # This makes the op more stable in FP16
-        return x * torch.rsqrt((x * x).mean(-1, keepdim=True) + self.eps)
+        eps = self.eps / (x_max * x_max)
+        return x * torch.rsqrt((x * x).mean(-1, keepdim=True) + eps)

     def forward(self, x):
         """
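The change rescales `eps` by `1 / x_max**2` so that pre-dividing `x` by its row maximum (for FP16 stability) no longer perturbs the result: since `mean((x/m)**2) = mean(x**2) / m**2`, adding `eps / m**2` inside the `rsqrt` is algebraically identical to adding `eps` in the unscaled form. A minimal standalone sketch (function names here are illustrative, not from the repo) that checks the two forms agree:

```python
import torch

EPS = 1e-5

def rmsnorm_naive(x):
    # Reference RMSNorm: x * rsqrt(mean(x^2) + eps)
    return x * torch.rsqrt((x * x).mean(-1, keepdim=True) + EPS)

def rmsnorm_stable(x):
    # Pre-scale each row by its absolute max so intermediates stay
    # small in FP16, and rescale eps by 1/max^2 so the result is
    # algebraically identical to the naive form.
    x_max, _ = torch.abs(x).max(-1, keepdim=True)
    x = x / x_max
    scaled_eps = EPS / (x_max * x_max)
    return x * torch.rsqrt((x * x).mean(-1, keepdim=True) + scaled_eps)

x = torch.randn(4, 8, dtype=torch.float64)
assert torch.allclose(rmsnorm_naive(x), rmsnorm_stable(x))
```

Without the `eps` rescaling (i.e. the pre-commit code), the two forms diverge whenever `x_max` is far from 1, since the fixed `eps` is then added to an already-rescaled mean-square term.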

0 commit comments
