Commit bab4ad8

server: tests: add new tokens regex following new repeat penalties default changed in (#6127)
1 parent df03b2d

File tree

1 file changed (+6, -6 lines)

examples/server/tests/features/server.feature

Lines changed: 6 additions & 6 deletions
@@ -35,9 +35,9 @@ Feature: llama.cpp server
 And metric llamacpp:tokens_predicted is <n_predicted>

 Examples: Prompts
-| prompt | n_predict | re_content | n_prompt | n_predicted | truncated |
-| I believe the meaning of life is | 8 | (read\|going)+ | 18 | 8 | not |
-| Write a joke about AI from a very long prompt which will not be truncated | 256 | (princesses\|everyone\|kids\|Anna)+ | 46 | 64 | not |
+| prompt | n_predict | re_content | n_prompt | n_predicted | truncated |
+| I believe the meaning of life is | 8 | (read\|going)+ | 18 | 8 | not |
+| Write a joke about AI from a very long prompt which will not be truncated | 256 | (princesses\|everyone\|kids\|Anna\|forest)+ | 46 | 64 | not |

 Scenario: Completion prompt truncated
 Given a prompt:

@@ -65,9 +65,9 @@ Feature: llama.cpp server
 And the completion is <truncated> truncated

 Examples: Prompts
-| model | system_prompt | user_prompt | max_tokens | re_content | n_prompt | n_predicted | enable_streaming | truncated |
-| llama-2 | Book | What is the best book | 8 | (Here\|what)+ | 77 | 8 | disabled | not |
-| codellama70b | You are a coding assistant. | Write the fibonacci function in c++. | 128 | (thanks\|happy\|bird)+ | -1 | 64 | enabled | |
+| model | system_prompt | user_prompt | max_tokens | re_content | n_prompt | n_predicted | enable_streaming | truncated |
+| llama-2 | Book | What is the best book | 8 | (Here\|what)+ | 77 | 8 | disabled | not |
+| codellama70b | You are a coding assistant. | Write the fibonacci function in c++. | 128 | (thanks\|happy\|bird\|Annabyear)+ | -1 | 64 | enabled | |


 Scenario: Tokenize / Detokenize
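
As context for the change above: the re_content column holds a regular expression that the test harness searches for in the model's completion, and the pipes are written as \| only because | is the Gherkin table cell separator. The sketch below illustrates that kind of check in Python; it is not the repository's actual behave step code, and the helper name and sample completion text are invented for illustration.

import re

def content_matches(completion: str, re_content: str) -> bool:
    # In the feature table "|" is escaped as "\|" because "|" delimits
    # Gherkin cells; undo that so the regex reads naturally.
    pattern = re_content.replace(r"\|", "|")
    # The check only requires that some expected token appears
    # somewhere in the generated text.
    return re.search(pattern, completion) is not None

# Illustrative usage with the widened regex from the first Examples table
# (the completion text is made up):
completion = "everyone in the forest laughed at the joke about kids"
assert content_matches(completion, r"(princesses\|everyone\|kids\|Anna\|forest)+")

Widening the alternations (adding forest and Annabyear) keeps these assertions passing now that the repeat-penalty default changed in #6127 alters which tokens the test prompts tend to produce.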
