This repository was archived by the owner on Aug 7, 2024. It is now read-only.

Commit fee25cf

Author: Andrew Gu
Set amax_and_scale_synced unconditionally
ghstack-source-id: a0b3638
Pull Request resolved: #220
1 parent: 746519f

File tree: 1 file changed (+2, -4 lines)

float8_experimental/float8_linear_utils.py

Lines changed: 2 additions & 4 deletions
@@ -281,7 +281,5 @@ def sync_float8_amax_and_scale_history(model: torch.nn.Module, fp8_layers=None)
         child.fp8_scale_w.copy_(new_w_scales[idx])
         child.fp8_scale_dL_dY.copy_(new_dL_dY_scales[idx])
 
-        # 4. set a flag to signal amaxes/scales are ready
-        # We only update the flag if we know it will be checked by the modules
-        if fp8_config.enable_amax_init:
-            child.amax_and_scale_synced = True
+        # Set a flag to signal amaxes/scales are ready
+        child.amax_and_scale_synced = True
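For context, here is a minimal Python sketch of how the tail of the per-layer loop in sync_float8_amax_and_scale_history reads after this change. The loop variables (fp8_layers, new_w_scales, new_dL_dY_scales, idx, child) are taken from the diff context; the rest of the loop body is paraphrased, not copied, so treat this as an illustration rather than the exact source.

# Sketch only: paraphrased end of sync_float8_amax_and_scale_history's
# per-layer loop after this commit. The omitted code above this point
# gathers amax histories and computes the new scales.
for idx, child in enumerate(fp8_layers):
    # ... amax histories gathered and new scales computed above ...
    child.fp8_scale_w.copy_(new_w_scales[idx])
    child.fp8_scale_dL_dY.copy_(new_dL_dY_scales[idx])

    # Before this commit the flag was only set when
    # fp8_config.enable_amax_init was True; it is now set unconditionally
    # so modules always see that amaxes/scales are ready.
    child.amax_and_scale_synced = True

Setting the flag unconditionally removes the dependency on fp8_config.enable_amax_init, so layers report a completed sync regardless of how amax initialization is configured.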
