Commit 9540255

llama-chat : fix multiple system message for gemma, orion (#14246)
1 parent: 3865cff

File tree: 1 file changed (+2, -2 lines)


src/llama-chat.cpp (2 additions, 2 deletions)

@@ -333,7 +333,7 @@ int32_t llm_chat_apply_template(
         std::string role(message->role);
         if (role == "system") {
             // there is no system message for gemma, but we will merge it with user prompt, so nothing is broken
-            system_prompt = trim(message->content);
+            system_prompt += trim(message->content);
             continue;
         }
         // in gemma, "assistant" is "model"
@@ -355,7 +355,7 @@ int32_t llm_chat_apply_template(
         std::string role(message->role);
         if (role == "system") {
             // there is no system message support, we will merge it with user prompt
-            system_prompt = message->content;
+            system_prompt += message->content;
             continue;
         } else if (role == "user") {
             ss << "Human: ";
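The one-character change from `=` to `+=` is the whole fix: with plain assignment, each new system message overwrote the previous one, so only the last system message survived into the merged prompt. A minimal standalone sketch of the fixed behavior (the `chat_message` struct and `merge_system_prompts` helper below are hypothetical illustrations, not the llama.cpp API):

```cpp
#include <string>
#include <vector>

// Hypothetical message type, standing in for llama.cpp's chat message.
struct chat_message {
    std::string role;
    std::string content;
};

// Mirrors the patched template logic: since these templates have no native
// system role, every system message is accumulated into one prompt string
// (system_prompt += ...) instead of replacing it (system_prompt = ...).
std::string merge_system_prompts(const std::vector<chat_message> & msgs) {
    std::string system_prompt;
    for (const auto & m : msgs) {
        if (m.role == "system") {
            system_prompt += m.content; // the fix: append, don't overwrite
        }
    }
    return system_prompt;
}
```

With two system messages `"Be concise. "` and `"Answer in English."`, the merged prompt now contains both; before the fix it would have contained only the second.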
