Commit 8d50886

spell Ollama with a capital O
1 parent beb41a5 commit 8d50886

File tree

3 files changed (+6 −20 lines)


llms/+internal/callOllamaChatAPI.m

Lines changed: 2 additions & 16 deletions
@@ -1,26 +1,12 @@
 function [text, message, response] = callOllamaChatAPI(model, messages, nvp)
 % This function is undocumented and will change in a future release
 
-%callOllamaChatAPI Calls the ollama chat completions API.
+%callOllamaChatAPI Calls the Ollama chat completions API.
 %
 % MESSAGES and FUNCTIONS should be structs matching the json format
-% required by the ollama Chat Completions API.
+% required by the Ollama Chat Completions API.
 % Ref: https://github.com/ollama/ollama/blob/main/docs/api.md
 %
-% Currently, the supported NVP are, including the equivalent name in the API:
-% TODO TODO TODO
-% - Temperature (temperature)
-% - TopProbabilityMass (top_p)
-% - NumCompletions (n)
-% - StopSequences (stop)
-% - MaxNumTokens (max_tokens)
-% - PresencePenalty (presence_penalty)
-% - FrequencyPenalty (frequence_penalty)
-% - ResponseFormat (response_format)
-% - Seed (seed)
-% - ApiKey
-% - TimeOut
-% - StreamFun
 % More details on the parameters: https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values
 %
 % Example
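The helper above ultimately issues requests against Ollama's `/api/chat` endpoint documented at the `api.md` link in the diff. As a rough sketch of how the name-value pairs map onto the wire format (function name and the subset of options shown are illustrative, not the actual MATLAB implementation), a request body can be assembled like this:

```python
import json

def build_ollama_chat_request(model, messages, temperature=None, top_p=None,
                              stop=None, seed=None, stream=False):
    """Build a request body for Ollama's /api/chat endpoint.

    Names mirror a few of the NVPs listed in callOllamaChatAPI.m:
    Temperature -> temperature, TopProbabilityMass -> top_p,
    StopSequences -> stop, Seed -> seed.
    """
    body = {"model": model, "messages": messages, "stream": stream}
    # Ollama nests sampling parameters under an "options" object.
    options = {k: v for k, v in {
        "temperature": temperature,
        "top_p": top_p,
        "stop": stop,
        "seed": seed,
    }.items() if v is not None}
    if options:
        body["options"] = options
    return body

payload = build_ollama_chat_request(
    "llama3",
    [{"role": "user", "content": "Hello!"}],
    temperature=0.5, seed=42,
)
print(json.dumps(payload))
```

POSTing this JSON to `http://localhost:11434/api/chat` (the default local endpoint) returns the assistant message that the MATLAB wrapper unpacks into `text` and `message`.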

README.md

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ Some of the [LLMs currently supported out of the box on Ollama](https://ollama.c
 
 - For OpenAI connections: An active OpenAI API subscription and API key.
 - For Azure OpenAI Services: An active Azure subscription with OpenAI access, deployment, and API key.
-- For Ollama: A local ollama installation. Currently, only connections on `localhost` are supported, i.e., Ollama and MATLAB must run on the same machine.
+- For Ollama: A local Ollama installation. Currently, only connections on `localhost` are supported, i.e., Ollama and MATLAB must run on the same machine.
 
 ## Setup
 
ollamaChat.m

Lines changed: 3 additions & 3 deletions
@@ -46,7 +46,7 @@
 % generate - Generate a response using the ollamaChat instance.
 %
 % ollamaChat Properties, in addition to the name-value pairs above:
-% Model - Model name (as expected by ollama server)
+% Model - Model name (as expected by Ollama server)
 %
 % SystemPrompt - System prompt.
 
@@ -154,9 +154,9 @@
 
 methods(Static)
 function mdls = models
-%ollamaChat.models - return models available on ollama server
+%ollamaChat.models - return models available on Ollama server
 % MDLS = ollamaChat.models returns a string vector MDLS
-% listing the models available on the local ollama server.
+% listing the models available on the local Ollama server.
 %
 % These names can be used in the ollamaChat constructor.
 % For names with a colon, such as "phi:latest", it is
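`ollamaChat.models` lists locally available models, which Ollama exposes via its `GET /api/tags` endpoint. A minimal sketch of extracting the model names from that endpoint's response (the sample response is illustrative; the helper function name is hypothetical, not part of the MATLAB code):

```python
import json

# Illustrative response in the shape returned by Ollama's GET /api/tags,
# which is what a models listing like ollamaChat.models draws on.
sample_tags_response = json.loads(
    '{"models": [{"name": "phi:latest"}, {"name": "mistral:latest"}]}'
)

def list_ollama_models(tags_response):
    # Pull out just the name strings, e.g. "phi:latest".
    return [m["name"] for m in tags_response["models"]]

print(list_ollama_models(sample_tags_response))
# ['phi:latest', 'mistral:latest']
```

Names with a colon, such as `"phi:latest"`, include the tag; either form can then be passed to the ollamaChat constructor as described above.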
