
Commit cf8cc85

server : Fixed wrong function name in llamacpp server unit test (#11473)
The test_completion_stream_with_openai_library() function actually ran with stream=False (the default), while test_completion_with_openai_library() ran with stream=True, so this commit swaps the two names to match each test's behavior.
1 parent d0c0804 commit cf8cc85

1 file changed: +2, -2 lines
examples/server/tests/unit/test_completion.py (2 additions, 2 deletions)

@@ -87,7 +87,7 @@ def test_completion_stream_vs_non_stream():
     assert content_stream == res_non_stream.body["content"]


-def test_completion_stream_with_openai_library():
+def test_completion_with_openai_library():
     global server
     server.start()
     client = OpenAI(api_key="dummy", base_url=f"http://{server.server_host}:{server.server_port}/v1")
@@ -102,7 +102,7 @@ def test_completion_stream_with_openai_library():
     assert match_regex("(going|bed)+", res.choices[0].text)


-def test_completion_with_openai_library():
+def test_completion_stream_with_openai_library():
     global server
     server.start()
     client = OpenAI(api_key="dummy", base_url=f"http://{server.server_host}:{server.server_port}/v1")
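For context, here is a minimal sketch (not the repository's test code) of the difference the swapped names are meant to reflect when using the OpenAI Python client against a llama.cpp server: with stream=False the call returns a single completion object, while with stream=True it yields incremental chunks. The base URL, model name, prompt, and token limit below are illustrative assumptions.

# Minimal sketch, assuming an OpenAI-compatible llama.cpp server at localhost:8080;
# not the repository's test code.
from openai import OpenAI

client = OpenAI(api_key="dummy", base_url="http://localhost:8080/v1")

# stream=False (the default): one response object containing the full text.
res = client.completions.create(
    model="any",  # placeholder; the server is assumed to use its loaded model
    prompt="I believe the meaning of life is",
    max_tokens=8,
    stream=False,
)
print(res.choices[0].text)

# stream=True: an iterator of chunks, each carrying a piece of the text.
chunks = client.completions.create(
    model="any",  # placeholder
    prompt="I believe the meaning of life is",
    max_tokens=8,
    stream=True,
)
print("".join(chunk.choices[0].text for chunk in chunks))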
