
Commit c5688c6

server : clarify some params in the docs (#5640)
1 parent 4ef245a commit c5688c6

File tree

1 file changed: +3 −3 lines


examples/server/README.md

```diff
@@ -151,7 +151,7 @@ node index.js
 
 `temperature`: Adjust the randomness of the generated text (default: 0.8).
 
-`dynatemp_range`: Dynamic temperature range (default: 0.0, 0.0 = disabled).
+`dynatemp_range`: Dynamic temperature range. The final temperature will be in the range of `[temperature - dynatemp_range; temperature + dynatemp_range]` (default: 0.0, 0.0 = disabled).
 
 `dynatemp_exponent`: Dynamic temperature exponent (default: 1.0).
 
```
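The clarified `dynatemp_range` wording can be illustrated with a short sketch (the helper below is hypothetical, not part of the server code):

```python
def dynatemp_bounds(temperature: float, dynatemp_range: float) -> tuple[float, float]:
    """Bounds of the final sampling temperature per the clarified doc text:
    [temperature - dynatemp_range, temperature + dynatemp_range]."""
    return (temperature - dynatemp_range, temperature + dynatemp_range)

# With the default dynatemp_range = 0.0 the range collapses to the fixed
# temperature, i.e. dynamic temperature is disabled:
print(dynatemp_bounds(0.8, 0.0))  # -> (0.8, 0.8)
# With dynatemp_range = 0.5 the final temperature varies in [0.3, 1.3]:
print(dynatemp_bounds(0.8, 0.5))
```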
```diff
@@ -209,7 +209,7 @@ node index.js
 
 `slot_id`: Assign the completion task to an specific slot. If is -1 the task will be assigned to a Idle slot (default: -1)
 
-`cache_prompt`: Save the prompt and generation for avoid reprocess entire prompt if a part of this isn't change (default: false)
+`cache_prompt`: Re-use previously cached prompt from the last request if possible. This may prevent re-caching the prompt from scratch. (default: false)
 
 `system_prompt`: Change the system prompt (initial prompt of all slots), this is useful for chat applications. [See more](#change-system-prompt-on-runtime)
 
```
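The `slot_id` and `cache_prompt` parameters documented in this hunk go into the completion request body; a payload sketch, with illustrative values and no running server required:

```python
import json

# Illustrative completion-request payload using the parameters above.
payload = {
    "prompt": "Building a website can be done in 10 simple steps:",
    "temperature": 0.8,
    "slot_id": 0,          # -1 (the default) assigns the task to any idle slot
    "cache_prompt": True,  # re-use the previously cached prompt if possible
}
print(json.dumps(payload, indent=2))
```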
```diff
@@ -242,7 +242,7 @@ Notice that each `probs` is an array of length `n_probs`.
 
 - `content`: Completion result as a string (excluding `stopping_word` if any). In case of streaming mode, will contain the next token as a string.
 - `stop`: Boolean for use with `stream` to check whether the generation has stopped (Note: This is not related to stopping words array `stop` from input options)
-- `generation_settings`: The provided options above excluding `prompt` but including `n_ctx`, `model`
+- `generation_settings`: The provided options above excluding `prompt` but including `n_ctx`, `model`. These options may differ from the original ones in some way (e.g. bad values filtered out, strings converted to tokens, etc.).
 - `model`: The path to the model loaded with `-m`
 - `prompt`: The provided `prompt`
 - `stopped_eos`: Indicating whether the completion has stopped because it encountered the EOS token
```
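The response fields in this hunk can be consumed as in the following sketch (the response dict is mocked; the field names come from the README text above):

```python
# Mocked completion response using the fields documented above.
response = {
    "content": "Hello!",
    "stop": True,
    "generation_settings": {"n_ctx": 512, "temperature": 0.8},
    "model": "models/7B/ggml-model.gguf",
    "stopped_eos": True,
}

if response["stop"]:
    # generation_settings echoes the *effective* options, which may differ
    # from the ones sent (bad values filtered out, strings tokenized, ...).
    effective = response["generation_settings"]
    print(response["content"], "| effective temperature:", effective["temperature"])
```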
