Commit e3d82aa
Update on "Support llama3.1"

Summary: Add scaled RoPE.

Test Plan: Tested with the official checkpoint; it produces meaningful results.

Differential Revision: [D60129927](https://our.internmc.facebook.com/intern/diff/D60129927)

[ghstack-poisoned]
1 parent 468433a commit e3d82aa

File tree

1 file changed: +2 −2 lines changed

examples/models/llama2/rope.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -48,8 +48,8 @@ def precompute_freqs_cis(
     )
     t = torch.arange(end, device=freqs.device)  # pyre-ignore
     if use_scaled:
-        freqs = apply_scaling(freqs)
-    freqs = torch.outer(t, freqs).float()  # pyre-ignore
+        freqs = apply_scaling(freqs)  # pyre-ignore
+    freqs = torch.outer(t, freqs).float()
     freqs_cos = torch.cos(freqs)
     freqs_sin = torch.sin(freqs)
     return freqs_cos, freqs_sin
```
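For context, the `apply_scaling` call above is the llama3.1-style frequency rescaling applied before the outer product with positions. The diff does not show its body, so the following is a minimal sketch assuming the scaling constants from the llama3.1 reference implementation (`scale_factor=8`, `low_freq_factor=1`, `high_freq_factor=4`, `old_context_len=8192`); the constant values and the simplified `precompute_freqs_cis` below are illustrative, not the committed code.

```python
import math

import torch


def apply_scaling(freqs: torch.Tensor) -> torch.Tensor:
    # Assumed llama3.1 reference constants (not shown in this diff).
    scale_factor = 8.0
    low_freq_factor = 1.0
    high_freq_factor = 4.0
    old_context_len = 8192  # original pretraining context length

    low_freq_wavelen = old_context_len / low_freq_factor
    high_freq_wavelen = old_context_len / high_freq_factor

    new_freqs = []
    for freq in freqs.tolist():
        wavelen = 2 * math.pi / freq
        if wavelen < high_freq_wavelen:
            # High-frequency components are left untouched.
            new_freqs.append(freq)
        elif wavelen > low_freq_wavelen:
            # Low-frequency components are downscaled to stretch the context.
            new_freqs.append(freq / scale_factor)
        else:
            # Smoothly interpolate between the two regimes.
            smooth = (old_context_len / wavelen - low_freq_factor) / (
                high_freq_factor - low_freq_factor
            )
            new_freqs.append((1 - smooth) * freq / scale_factor + smooth * freq)
    return torch.tensor(new_freqs, dtype=freqs.dtype, device=freqs.device)


def precompute_freqs_cis(
    dim: int, end: int, theta: float = 10000.0, use_scaled: bool = False
):
    # Inverse-frequency spectrum of standard RoPE.
    freqs = 1.0 / (theta ** (torch.arange(0, dim, 2)[: dim // 2].float() / dim))
    t = torch.arange(end, device=freqs.device)
    if use_scaled:
        freqs = apply_scaling(freqs)
    # Position-by-frequency angle table, then its cos/sin halves.
    freqs = torch.outer(t, freqs).float()
    return torch.cos(freqs), torch.sin(freqs)
```

With `dim=8` and `end=16`, both returned tables have shape `(16, 4)`; passing `use_scaled=True` only rescales the per-dimension frequencies, not the table shape.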
