Commit 8971a0f

ngxson authored and tybalex committed
Server: clean up OAI params parsing function (ggml-org#6284)
* server: clean up oai parsing function
* fix response_format
* fix empty response_format
* minor fixes
* add TODO for logprobs
* update docs
1 parent 546938d commit 8971a0f

File tree

1 file changed: +6 −8 lines


examples/server/utils.hpp

Lines changed: 6 additions & 8 deletions
```diff
@@ -745,8 +745,6 @@ static json oaicompat_completion_params_parse(
     llama_params["temperature"] = json_value(body, "temperature", 0.0);
     llama_params["top_p"]       = json_value(body, "top_p", 1.0);
 
-    // Apply chat template to the list of messages
-    llama_params["prompt"] = format_chat(model, chat_template, body["messages"]);
 
     // Handle "stop" field
     if (body.contains("stop") && body["stop"].is_string()) {
@@ -785,12 +783,12 @@ static json oaicompat_completion_params_parse(
     }
 
     // Params supported by OAI but unsupported by llama.cpp
-    static const std::vector<std::string> unsupported_params { "tools", "tool_choice" };
-    for (auto & param : unsupported_params) {
-        if (body.contains(param)) {
-            throw std::runtime_error("Unsupported param: " + param);
-        }
-    }
+    // static const std::vector<std::string> unsupported_params { "tools", "tool_choice" };
+    // for (auto & param : unsupported_params) {
+    //     if (body.contains(param)) {
+    //         throw std::runtime_error("Unsupported param: " + param);
+    //     }
+    // }
 
     // Copy remaining properties to llama_params
     // This allows user to use llama.cpp-specific params like "mirostat", "tfs_z",... via OAI endpoint.
```
