
Commit fc1798c

Disable yet another flaky Ollama test point
It is unclear at this time why this test point is unreliable, but it just started failing in GitHub CI, possibly following an Ollama update. We are not explicitly promising this behaviour, and the change was not on our side.
1 parent c454452


tests/tollamaChat.m

Lines changed: 5 additions & 0 deletions
@@ -47,6 +47,11 @@ function doGenerateUsingSystemPrompt(testCase)
         end
 
         function extremeTopK(testCase)
+            %% This should work, and it does on some computers. On others, Ollama
+            %% receives the parameter, but either Ollama or llama.cpp fails to
+            %% honor it correctly.
+            testCase.assumeTrue(false,"disabled due to Ollama/llama.cpp not honoring parameter reliably");
+
             % setting top-k to k=1 leaves no random choice,
             % so we expect to get a fixed response.
             chat = ollamaChat("mistral",TopK=1);
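
For context on why this one-line change disables the test point without failing it: in MATLAB's unit-test framework (matlab.unittest), a failed assumption such as testCase.assumeTrue(false, ...) marks the test as filtered (incomplete) rather than failed, so CI stays green while the diagnostic string records the reason. A minimal sketch of the pattern, with illustrative class and method names (tExample and flakyPoint are not from this repository):

    classdef tExample < matlab.unittest.TestCase
        % Illustrative sketch of disabling a single test point via an assumption.
        methods (Test)
            function flakyPoint(testCase)
                % assumeTrue(false, ...) filters the test instead of failing it;
                % the diagnostic documents why it is disabled.
                testCase.assumeTrue(false, "disabled: upstream behaviour is unreliable");

                % The original test body stays in place, so re-enabling the
                % test later only requires deleting the assumeTrue line.
                testCase.verifyEqual(1 + 1, 2);
            end
        end
    end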
