
Commit 421288b

Comment / lint
1 parent 5834d14 commit 421288b

File tree

1 file changed: +1 -3 lines changed


extension/llm/tokenizer/utils.py

Lines changed: 1 addition & 3 deletions
@@ -17,12 +17,10 @@ def get_tokenizer(tokenizer_path: str, tokenizer_config_path: Optional[str] = None):
     if tokenizer_path.endswith(".json"):
         from tokenizers import Tokenizer

-        # Load the tokenizer from the tokenizer.json file
         tokenizer = Tokenizer.from_file(tokenizer_path)

-        # export_llama expects n_words attribute.
-        tokenizer.n_words = tokenizer.get_vocab_size()
         # Keep in line with internal tokenizer apis.
+        tokenizer.n_words = tokenizer.get_vocab_size()
         tokenizer.decode_token = lambda token: tokenizer.decode([token])
         original_encode = tokenizer.encode
         tokenizer.encode = lambda prompt, **kwargs: original_encode(prompt).ids
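
For reference, a minimal sketch of what the patched tokenizer looks like after this change, assuming a Hugging Face tokenizers-format tokenizer.json on disk; the "tokenizer.json" path and the sample prompt are placeholders, not part of the commit.

from tokenizers import Tokenizer

# Load the HF-format tokenizer (path is a placeholder).
tokenizer = Tokenizer.from_file("tokenizer.json")

# Patch the instance to keep in line with internal tokenizer APIs,
# mirroring the shims applied in get_tokenizer above.
tokenizer.n_words = tokenizer.get_vocab_size()
tokenizer.decode_token = lambda token: tokenizer.decode([token])
original_encode = tokenizer.encode
tokenizer.encode = lambda prompt, **kwargs: original_encode(prompt).ids

# Callers can now use it like the internal tokenizer:
ids = tokenizer.encode("hello world")      # list of token ids
piece = tokenizer.decode_token(ids[0])     # decode a single id
print(tokenizer.n_words, ids, piece)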
