`examples/server/README.md` — 3 additions & 3 deletions
```diff
@@ -151,7 +151,7 @@ node index.js
 
 `temperature`: Adjust the randomness of the generated text (default: 0.8).
 
-`dynatemp_range`: Dynamic temperature range (default: 0.0, 0.0 = disabled).
+`dynatemp_range`: Dynamic temperature range. The final temperature will be in the range of `[temperature - dynatemp_range; temperature + dynatemp_range]` (default: 0.0, 0.0 = disabled).
 
 `dynatemp_exponent`: Dynamic temperature exponent (default: 1.0).
```
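The new `dynatemp_range` wording can be illustrated with a minimal sketch. This is not the server's sampler (which scales temperature by token-distribution entropy and an exponent); it only computes the bounds the description promises, with the lower bound clamped to zero as an assumption:

```python
# Sketch of the temperature bounds implied by `dynatemp_range` above.
# NOT llama.cpp's implementation; the clamp at 0.0 is an assumption here.

def dynatemp_bounds(temperature: float, dynatemp_range: float) -> tuple[float, float]:
    """Return the (min, max) temperature the sampler may pick from."""
    if dynatemp_range <= 0.0:  # 0.0 disables dynamic temperature
        return (temperature, temperature)
    lo = max(0.0, temperature - dynatemp_range)
    hi = temperature + dynatemp_range
    return (lo, hi)

# e.g. temperature = 1.0 with dynatemp_range = 0.5:
print(dynatemp_bounds(1.0, 0.5))  # (0.5, 1.5)
```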
```diff
@@ -209,7 +209,7 @@ node index.js
 
 `slot_id`: Assign the completion task to an specific slot. If is -1 the task will be assigned to a Idle slot (default: -1)
 
-`cache_prompt`: Save the prompt and generation for avoid reprocess entire prompt if a part of this isn't change (default: false)
+`cache_prompt`: Re-use previously cached prompt from the last request if possible. This may prevent re-caching the prompt from scratch. (default: false)
 
 `system_prompt`: Change the system prompt (initial prompt of all slots), this is useful for chat applications. [See more](#change-system-prompt-on-runtime)
```
```diff
@@ -242,7 +242,7 @@ Notice that each `probs` is an array of length `n_probs`.
 
 - `content`: Completion result as a string (excluding `stopping_word` if any). In case of streaming mode, will contain the next token as a string.
 - `stop`: Boolean for use with `stream` to check whether the generation has stopped (Note: This is not related to stopping words array `stop` from input options)
-- `generation_settings`: The provided options above excluding `prompt` but including `n_ctx`, `model`
+- `generation_settings`: The provided options above excluding `prompt` but including `n_ctx`, `model`. These options may differ from the original ones in some way (e.g. bad values filtered out, strings converted to tokens, etc.).
 - `model`: The path to the model loaded with `-m`
 - `prompt`: The provided `prompt`
 - `stopped_eos`: Indicating whether the completion has stopped because it encountered the EOS token
```
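To make the result fields above concrete, here is a hedged illustration of a `/completion` response shape as a Python dict. Only the field names come from the README excerpt; every value (paths, numbers, text) is invented for illustration:

```python
# Illustrative /completion result; field names are from the README above,
# all values are made-up examples, not real server output.
result = {
    "content": "Hello there!",       # completion text (next token when streaming)
    "stop": True,                    # generation has stopped
    "generation_settings": {         # request options, possibly sanitized,
        "n_ctx": 2048,               # plus n_ctx and model
        "model": "models/7B/model.gguf",
        "temperature": 0.8,
    },
    "model": "models/7B/model.gguf", # path passed via -m
    "prompt": "Hello",               # the provided prompt
    "stopped_eos": True,             # stopped on the EOS token
}

print(sorted(result))
```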