Commit 9ddd87d

Fix: add default stop sequence to chatml chat format

1 parent: e985ea2

File tree: 1 file changed (+1, −1)

llama_cpp/llama_chat_format.py
```diff
@@ -565,7 +565,7 @@ def format_chatml(
     _messages = _map_roles(messages, _roles)
     _messages.append((_roles["assistant"], None))
     _prompt = _format_chatml(system_message, _messages, _sep)
-    return ChatFormatterResponse(prompt=_prompt)
+    return ChatFormatterResponse(prompt=_prompt, stop=_sep)


 @register_chat_completion_handler("functionary")
```
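The motivation for the one-line change can be sketched as follows. In ChatML, every turn is terminated by the separator token `<|im_end|>`; if the formatter does not also return that separator as a stop sequence, the model can generate past the end of the assistant turn and start hallucinating further turns. The sketch below is a minimal, self-contained illustration of this behavior, not the library's actual implementation: `ChatFormatterResponse` and `format_chatml_sketch` here are simplified stand-ins, and the assumption that `_sep` equals `"<|im_end|>"` follows from the standard ChatML template.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Assumed value of `_sep` in the real formatter (standard ChatML separator).
CHATML_SEP = "<|im_end|>"


@dataclass
class ChatFormatterResponse:
    # Simplified stand-in for the library's response type: the rendered
    # prompt plus an optional stop string the sampler should halt on.
    prompt: str
    stop: Optional[str] = None


def format_chatml_sketch(
    messages: List[Tuple[str, str]],
) -> ChatFormatterResponse:
    """Render (role, content) pairs as ChatML and return the stop string."""
    parts = []
    for role, content in messages:
        # Each completed turn is closed with the <|im_end|> separator.
        parts.append(f"<|im_start|>{role}\n{content}{CHATML_SEP}\n")
    # Open an unterminated assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    # The fix in this commit: pass the separator along as the stop
    # sequence so generation halts at the end of the assistant turn.
    return ChatFormatterResponse(prompt="".join(parts), stop=CHATML_SEP)


resp = format_chatml_sketch(
    [("system", "You are helpful."), ("user", "Hi")]
)
```

Without `stop=CHATML_SEP`, the caller would have to know the template's separator and supply it manually on every request; returning it from the formatter keeps that knowledge in one place.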
