Commit c1eddf6

z80maniac authored and ggerganov committed
server : fix passing prompt as tokens (ggml-org#5955)
* server: fix passing prompt as tokens
* Update examples/server/server.cpp

Co-authored-by: Georgi Gerganov <[email protected]>
1 parent 005364e commit c1eddf6

File tree

1 file changed: +10 −1 lines


examples/server/server.cpp

Lines changed: 10 additions & 1 deletion
@@ -852,7 +852,16 @@ struct server_context {
             // infill
             slot.params.input_prefix = json_value(data, "input_prefix", default_params.input_prefix);
             slot.params.input_suffix = json_value(data, "input_suffix", default_params.input_suffix);
-            slot.prompt = json_value(data, "prompt", std::string(""));
+
+            // get prompt
+            {
+                const auto & prompt = data.find("prompt");
+                if (prompt == data.end()) {
+                    slot.prompt = "";
+                } else {
+                    slot.prompt = *prompt;
+                }
+            }
 
             // penalize user-provided tokens
             {
