Adding support to Azure API #8
Merged
78 commits — ccreutzi

a32e681 Adding support to Azure API
ccd6961 merge main
26b1272 parameterize getApiKeyFromNvpOrEnv, allowing different env variables …
8bd236b get basic Azure connection working
e009e86 even smaller timeout; failed to throw an error in GitHub once
a6b8d51 add ollamaChat class
4304636 CI setup for ollama
61a5152 typos
9914a55 disable verySmallTimeOutErrors test points, since they are flaky
cd9bbe2 Updated README.md for Azure and Ollama
8a2ea28 Remove GPT specific penalties from ollamaChat
dde7d95 Implement TopProbabilityNum and StopSequences for ollamaChat
8d351a2 increase default timeout to 120 seconds for ollamaChat
e229935 add TailFreeSampling_Z, add comment about currently unsupported ollam…
b67f5ef new static method ollamaChat.models
bcf4d85 update API versions, following https://learn.microsoft.com/en-us/azur…
e51f5eb typo in help header
08a8549 add azureChat and ollamaChat to functionSignatures.json
410a87b Make StreamFun work with ollamaChat and azureChat
6b830a2 remove unused defaults, for more realistic coverage numbers
c349a58 Add test that azureChat with Seed fixes result
0815bdf try telling codecov to not worry about test files
e8a900d also ignore errorMessageCatalog.m in codecov, since almost all of it …
a2d65cd ignore examples/data/* just like data/*
5851aca Merge branch 'main' into AzureAPI
88f054b remove disabled timeout tests
5ed246d Add explanatory comment for missing key test.
3bd2208 changed wording as requested
beb41a5 simplify comment
8d50886 spell Ollama with a capital O
1a5ebb9 Fix capitalization: APIKey, by MathWorks naming standards.
72dca78 Codecov has problems with uploaded coverage. Try not using plus signs.
3639020 add unit tests for edge cases and errors in responseStreamer
a2b893a Add test point for function calls
0dd0e91 CI setup stores the API key in a different variable
5d2fd99 for better coverage, run tools test through streaming API
4ae0248 Function calling on Azure.
735416b Throw server errors as errors
95f2f13 For CI, add `$OPENAI_API_KEY` such that `openAIChat` works
492fefc Short error messages for bad endpoints
3000dc4 Nicer help headers
0973f11 typos
238e11a Rename openAIMessages to messageHistory
101cb59 Add openAIMessages fallback for backward compatibility
e095802 Modify chatbot example to use Ollama
f680fb5 Merge branch 'main' into AzureAPI
7e81d37 minimal and complete test for the backward compatibility function
2611327 Avoid bogus json
16a3833 Improve ollamaChat tab completion
b33dad5 Remove unused error ID
cdf0971 add test for Ollama chatbot example
3abfae4 mark trademarks
d577d36 Include ._* in .gitignore
466460d Rename `TopProbabilityMass` → `TopP`, `TopProbabilityNum` → `TopK`
24059e9 Take properties out of `ollamaChat` that do not apply: Tools and API key
632ac22 Merge branch 'main' into AzureAPI
d5eb25d Only drop `:latest` from model list
6108d0f ollamaChat.models should never return <missing>
5200776 fix indentations changed by renaming `TopProbabilityMass` to `TopP`
32547fb accept char and cellstr input for generate
4bd315c split README.md by backend
fa9f06e `FunctionNames` should only exist for connectors with tools
8336cbb Fix link typo
0543e3e Fix typo: This is not using OpenAI
3ebc529 update tests to expect correct errors
71f88a9 tabs to spaces
949fe42 `openAIImages` should derive from `needsAPIKey`
09b8662 test `NumCompletions`
a360f89 Ollama does not support `NumCompletions`
348fb67 `azureChat` should get endpoint and deployment from env
4fb866c clean up comments
e29e65c move error text to catalogue
b4eb44a Reorder test points for maintainability
757f260 Add a streaming example for `ollamaChat`
56e6aaf Log Ollama version during CI
f6a9106 Add Seed test to tollamaChat.m
392749f Use message from error catalog
1ac24ff Disable flaky test points
ccreutzi File filter
Filter by extension
Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
There are no files selected for viewing
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
Original file line number | Diff line number | Diff line change |
---|---|---|
@@ -0,0 +1,142 @@ | ||
function [text, message, response] = callAzureChatAPI(endpoint, deploymentID, messages, functions, nvp)
%callAzureChatAPI Calls the OpenAI chat completions API on Azure.
%
%   MESSAGES and FUNCTIONS should be structs matching the JSON format
%   required by the OpenAI Chat Completions API.
%   Ref: https://platform.openai.com/docs/guides/gpt/chat-completions-api
%
%   Currently, the supported NVP are, including the equivalent name in the API:
%    - ToolChoice (tool_choice)
%    - Temperature (temperature)
%    - TopProbabilityMass (top_p)
%    - NumCompletions (n)
%    - StopSequences (stop)
%    - MaxNumTokens (max_tokens)
%    - PresencePenalty (presence_penalty)
%    - FrequencyPenalty (frequency_penalty)
%    - ResponseFormat (response_format)
%    - Seed (seed)
%    - ApiKey
%    - TimeOut
%    - StreamFun
%   More details on the parameters: https://platform.openai.com/docs/api-reference/chat/create
%
%   Example
%
%   % Create messages struct
%   messages = {struct("role", "system",...
%       "content", "You are a helpful assistant");
%       struct("role", "user", ...
%       "content", "What is the edit distance between hi and hello?")};
%
%   % Create functions struct
%   functions = {struct("name", "editDistance", ...
%       "description", "Find edit distance between two strings or documents.", ...
%       "parameters", struct( ...
%       "type", "object", ...
%       "properties", struct(...
%           "str1", struct(...
%               "description", "Source string.", ...
%               "type", "string"),...
%           "str2", struct(...
%               "description", "Target string.", ...
%               "type", "string")),...
%       "required", ["str1", "str2"]))};
%
%   % Define your API key
%   apiKey = "your-api-key-here"
%
%   % Send a request
%   [text, message] = llms.internal.callAzureChatAPI(endpoint, deploymentID, messages, functions, ApiKey=apiKey)

% Copyright 2023-2024 The MathWorks, Inc.

arguments
    endpoint
    deploymentID
    messages
    functions
    nvp.ToolChoice = []
    nvp.APIVersion = "2023-05-15"
    nvp.Temperature = 1
    nvp.TopProbabilityMass = 1
    nvp.NumCompletions = 1
    nvp.StopSequences = []
    nvp.MaxNumTokens = inf
    nvp.PresencePenalty = 0
    nvp.FrequencyPenalty = 0
    nvp.ResponseFormat = "text"
    nvp.Seed = []
    nvp.ApiKey = ""
    nvp.TimeOut = 10
    nvp.StreamFun = []
end

URL = endpoint + "openai/deployments/" + deploymentID + "/chat/completions?api-version=" + nvp.APIVersion;

parameters = buildParametersCall(messages, functions, nvp);

[response, streamedText] = llms.internal.sendRequest(parameters, nvp.ApiKey, URL, nvp.TimeOut, nvp.StreamFun);

% If the call errors, "choices" will not be part of response.Body.Data;
% instead we get response.Body.Data.error.
if response.StatusCode=="OK"
    % Output the first generation
    if isempty(nvp.StreamFun)
        message = response.Body.Data.choices(1).message;
    else
        message = struct("role", "assistant", ...
            "content", streamedText);
    end
    if isfield(message, "tool_calls")
        text = "";
    else
        text = string(message.content);
    end
else
    text = "";
    message = struct();
end
end

function parameters = buildParametersCall(messages, functions, nvp)
% Builds a struct in the format that is expected by the API, combining
% MESSAGES, FUNCTIONS and parameters in NVP.

parameters = struct();
parameters.messages = messages;

parameters.stream = ~isempty(nvp.StreamFun);

if ~isempty(functions)
    parameters.tools = functions;
end

if ~isempty(nvp.ToolChoice)
    parameters.tool_choice = nvp.ToolChoice;
end

if ~isempty(nvp.Seed)
    parameters.seed = nvp.Seed;
end

dict = mapNVPToParameters;

nvpOptions = keys(dict);
for opt = nvpOptions.'
    if isfield(nvp, opt)
        parameters.(dict(opt)) = nvp.(opt);
    end
end
end

function dict = mapNVPToParameters()
dict = dictionary();
dict("Temperature") = "temperature";
dict("TopProbabilityMass") = "top_p";
dict("NumCompletions") = "n";
dict("StopSequences") = "stop";
dict("MaxNumTokens") = "max_tokens";
dict("PresencePenalty") = "presence_penalty";
dict("FrequencyPenalty") = "frequency_penalty";
end
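The help header above can be exercised end to end. A minimal sketch, assuming an Azure OpenAI resource with a chat deployment; the endpoint, deployment name, and environment variable name below are placeholders for illustration, not values from this PR:

```matlab
% Minimal usage sketch for callAzureChatAPI (hypothetical endpoint and
% deployment name; requires a valid Azure OpenAI resource to actually run).
endpoint = "https://my-resource.openai.azure.com/";   % placeholder
deploymentID = "my-chat-deployment";                  % placeholder

messages = {struct("role", "system", ...
                   "content", "You are a helpful assistant"); ...
            struct("role", "user", ...
                   "content", "Say hello in one word.")};

% No tools in this sketch, so pass an empty functions list.
[text, message, response] = llms.internal.callAzureChatAPI( ...
    endpoint, deploymentID, messages, [], ...
    ApiKey=getenv("AZURE_OPENAI_API_KEY"), ...        % placeholder variable
    Temperature=0, MaxNumTokens=10);

if response.StatusCode == "OK"
    disp(text)                     % first generation as a string
else
    disp(response.Body.Data.error) % server-side error payload
end
```

Note how the deployment, not the model name, selects the model: on Azure the model is fixed when the deployment is created, which is why the URL is built from `deploymentID` rather than a `model` field in the request body.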
function [text, message, response] = callOllamaChatAPI(model, messages, nvp)
%callOllamaChatAPI Calls the Ollama chat completions API.
%
%   MESSAGES should be structs matching the JSON format required by the
%   Ollama chat completions API.
%   Ref: https://github.com/ollama/ollama/blob/main/docs/api.md
%
%   Currently, the supported NVP are, including the equivalent name in the API:
%    - Temperature (temperature)
%    - TopProbabilityMass (top_p)
%    - NumCompletions (n)
%    - StopSequences (stop)
%    - MaxNumTokens (num_predict)
%    - Seed (seed)
%    - TimeOut
%    - StreamFun
%
%   Example
%
%   % Create messages struct
%   messages = {struct("role", "system",...
%       "content", "You are a helpful assistant");
%       struct("role", "user", ...
%       "content", "What is the edit distance between hi and hello?")};
%
%   % Send a request, where model is the name of a local Ollama model
%   [text, message] = llms.internal.callOllamaChatAPI(model, messages)

% Copyright 2023-2024 The MathWorks, Inc.

arguments
    model
    messages
    nvp.Temperature = 1
    nvp.TopProbabilityMass = 1
    nvp.NumCompletions = 1
    nvp.StopSequences = []
    nvp.MaxNumTokens = inf
    nvp.PresencePenalty = 0
    nvp.FrequencyPenalty = 0
    nvp.ResponseFormat = "text"
    nvp.Seed = []
    nvp.TimeOut = 10
    nvp.StreamFun = []
end

URL = "http://localhost:11434/api/chat"; % TODO: model parameter

parameters = buildParametersCall(model, messages, nvp);

[response, streamedText] = llms.internal.sendRequest(parameters, [], URL, nvp.TimeOut, nvp.StreamFun);

% If the call errors, "choices" will not be part of response.Body.Data;
% instead we get response.Body.Data.error.
if response.StatusCode=="OK"
    % Output the first generation
    if isempty(nvp.StreamFun)
        message = response.Body.Data.message;
    else
        message = struct("role", "assistant", ...
            "content", streamedText);
    end
    text = string(message.content);
else
    text = "";
    message = struct();
end
end

function parameters = buildParametersCall(model, messages, nvp)
% Builds a struct in the format that is expected by the API, combining
% MESSAGES and parameters in NVP.

parameters = struct();
parameters.model = model;
parameters.messages = messages;

parameters.stream = ~isempty(nvp.StreamFun);

options = struct;
if ~isempty(nvp.Seed)
    options.seed = nvp.Seed;
end

dict = mapNVPToParameters;

nvpOptions = keys(dict);
for opt = nvpOptions.'
    if isfield(nvp, opt)
        options.(dict(opt)) = nvp.(opt);
    end
end

parameters.options = options;
end

function dict = mapNVPToParameters()
dict = dictionary();
dict("Temperature") = "temperature";
dict("TopProbabilityMass") = "top_p";
dict("NumCompletions") = "n";
dict("StopSequences") = "stop";
dict("MaxNumTokens") = "num_predict";
end
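For comparison with the Azure path, calling the Ollama wrapper needs no API key, only a locally running Ollama server and a pulled model. A sketch, where the model name is an assumption for illustration:

```matlab
% Minimal usage sketch for callOllamaChatAPI; assumes `ollama serve` is
% listening on localhost:11434 and the named model has been pulled locally.
model = "mistral";   % placeholder model name

messages = {struct("role", "user", ...
                   "content", "What is 2+2? Answer with one number.")};

% Unlike callAzureChatAPI, no key is passed: sendRequest receives [] for
% the key, and the model is selected per request in the JSON body.
[text, message, response] = llms.internal.callOllamaChatAPI( ...
    model, messages, Temperature=0, MaxNumTokens=5);

if response.StatusCode == "OK"
    disp(text)   % model output as a string
end
```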
function key = getApiKeyFromNvpOrEnv(nvp,envVarName)

% This function is undocumented and will change in a future release

%getApiKeyFromNvpOrEnv Retrieves an API key from a Name-Value Pair struct or environment variable.
%
%   This function takes a struct nvp containing name-value pairs and checks if
%   it contains a field called "ApiKey". If the field is not found, the
%   function attempts to retrieve the API key from an environment variable
%   whose name is given as the second argument. If both methods fail, the
%   function throws an error.

% Copyright 2023-2024 The MathWorks, Inc.

if isfield(nvp, "ApiKey")
    key = nvp.ApiKey;
else
    if isenv(envVarName)
        key = getenv(envVarName);
    else
        error("llms:keyMustBeSpecified", llms.utils.errorMessageCatalog.getMessage("llms:keyMustBeSpecified", envVarName));
    end
end
end
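The parameterization added in this diff lets each connector name its own environment variable instead of hard-coding "OPENAI_API_KEY". A sketch of the two lookup paths; the environment variable name here is a placeholder, not necessarily the one the connectors use:

```matlab
% Path 1: explicit name-value pair wins.
nvp.ApiKey = "key-from-nvp";
key = llms.internal.getApiKeyFromNvpOrEnv(nvp, "AZURE_OPENAI_API_KEY");
% key is "key-from-nvp"

% Path 2: fall back to the named environment variable.
setenv("AZURE_OPENAI_API_KEY", "key-from-env");
key = llms.internal.getApiKeyFromNvpOrEnv(struct(), "AZURE_OPENAI_API_KEY");
% key is "key-from-env"; with neither the field nor the variable set,
% an "llms:keyMustBeSpecified" error is thrown instead.
```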
classdef (Abstract) textGenerator
%

% Copyright 2023-2024 The MathWorks, Inc.

    properties
        %TEMPERATURE Temperature of generation.
        Temperature {llms.utils.mustBeValidTemperature} = 1

        %TOPPROBABILITYMASS Top probability mass to consider for generation.
        TopProbabilityMass {llms.utils.mustBeValidTopP} = 1

        %STOPSEQUENCES Sequences to stop the generation of tokens.
        StopSequences {llms.utils.mustBeValidStop} = {}

        %PRESENCEPENALTY Penalty for using a token in the response that has already been used.
        PresencePenalty {llms.utils.mustBeValidPenalty} = 0

        %FREQUENCYPENALTY Penalty for using a token that is frequent in the training data.
        FrequencyPenalty {llms.utils.mustBeValidPenalty} = 0
    end

    properties (SetAccess=protected)
        %TIMEOUT Connection timeout in seconds (default 10 secs)
        TimeOut

        %FUNCTIONNAMES Names of the functions that the model can request calls to
        FunctionNames

        %SYSTEMPROMPT System prompt.
        SystemPrompt = []

        %RESPONSEFORMAT Response format, "text" or "json"
        ResponseFormat
    end

    properties (Access=protected)
        Tools
        FunctionsStruct
        ApiKey
        StreamFun
    end
end
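The abstract class above centralizes the sampling parameters shared by the OpenAI, Azure, and Ollama connectors in this PR; the protected properties are left to each concrete class to populate. A schematic subclass, purely for illustration (this class is not part of the PR):

```matlab
% Hypothetical connector deriving from textGenerator. A real connector
% (e.g. azureChat or ollamaChat) would also implement generate().
classdef myChat < llms.internal.textGenerator
    methods
        function this = myChat(systemPrompt)
            % Fill in the protected state the base class declares.
            this.SystemPrompt = systemPrompt;
            this.TimeOut = 10;
            this.ResponseFormat = "text";
            this.FunctionNames = [];
        end
    end
end
```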