
Commit cd9bbe2

Updated README.md for Azure and Ollama
1 parent 9914a55 commit cd9bbe2

1 file changed: +83 -21 lines

README.md

Lines changed: 83 additions & 21 deletions
@@ -2,11 +2,13 @@

[![Open in MATLAB Online](https://www.mathworks.com/images/responsive/global/open-in-matlab-online.svg)](https://matlab.mathworks.com/open/github/v1?repo=matlab-deep-learning/llms-with-matlab) [![View Large Language Models (LLMs) with MATLAB on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://www.mathworks.com/matlabcentral/fileexchange/163796-large-language-models-llms-with-matlab)

-This repository contains example code to demonstrate how to connect MATLAB to the OpenAI™ Chat Completions API (which powers ChatGPT™) as well as OpenAI Images API (which powers DALL·E™). This allows you to leverage the natural language processing capabilities of large language models directly within your MATLAB environment.
+This repository contains example code to demonstrate how to connect MATLAB to the OpenAI™ Chat Completions API (which powers ChatGPT™), the OpenAI Images API (which powers DALL·E™), [Azure® OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/), and local [Ollama](https://ollama.com/) models. This allows you to leverage the natural language processing capabilities of large language models directly within your MATLAB environment.

-The functionality shown here serves as an interface to the ChatGPT and DALL·E APIs. To start using the OpenAI APIs, you first need to obtain OpenAI API keys. You are responsible for any fees OpenAI may charge for the use of their APIs. You should be familiar with the limitations and risks associated with using this technology, and you agree that you shall be solely responsible for full compliance with any terms that may apply to your use of the OpenAI APIs.
+## OpenAI and Azure

-Some of the current LLMs supported are:
+The functionality shown here serves as an interface to the APIs listed above. To start using the OpenAI APIs, you first need to obtain OpenAI API keys; to use Azure OpenAI Service, you need to create a model deployment in your Azure account and obtain one of its keys. You are responsible for any fees OpenAI or Azure may charge for the use of their APIs. You should be familiar with the limitations and risks associated with using this technology, and you agree that you shall be solely responsible for full compliance with any terms that may apply to your use of the OpenAI or Azure APIs.
+
+Some of the current LLMs supported on Azure and OpenAI are:
- gpt-3.5-turbo, gpt-3.5-turbo-1106, gpt-3.5-turbo-0125
- gpt-4o, gpt-4o-2024-05-13 (GPT-4 Omni)
- gpt-4-turbo, gpt-4-turbo-2024-04-09 (GPT-4 Turbo with Vision)
@@ -15,6 +17,19 @@ Some of the current LLMs supported are:

For details on the specification of each model, check the official [OpenAI documentation](https://platform.openai.com/docs/models).

+## Ollama
+
+To use local models with [Ollama](https://ollama.com/), you will need to install and start an Ollama server, and “pull” models into it. Please follow the Ollama documentation for details. You should be familiar with the limitations and risks associated with using this technology, and you agree that you shall be solely responsible for full compliance with any terms that may apply to your use of any specific model.
+
+Some of the [LLMs currently supported out of the box on Ollama](https://ollama.com/library) are:
+- llama2, llama2-uncensored, llama3, codellama
+- phi3
+- aya
+- mistral (v0.1, v0.2, v0.3)
+- mixtral
+- gemma, codegemma
+- command-r
+
## Requirements

### MathWorks Products (https://www.mathworks.com)
@@ -24,7 +39,9 @@ For details on the specification of each model, check the official [OpenAI docum

### 3rd Party Products:

-- An active OpenAI API subscription and API key.
+- For OpenAI connections: An active OpenAI API subscription and API key.
+- For Azure OpenAI Service: An active Azure subscription with OpenAI access, a model deployment, and an API key.
+- For Ollama: A local Ollama installation. Currently, only connections on `localhost` are supported, i.e., Ollama and MATLAB must run on the same machine.

## Setup

@@ -51,14 +68,30 @@ To use this repository with a local installation of MATLAB, first clone the repo
addpath('path/to/llms-with-matlab');
```

-### Setting up your API key
+### Setting up your OpenAI API key

Set up your OpenAI API key. Create a `.env` file in the project root directory with the following content.

```
OPENAI_API_KEY=<your key>
```
-
+
+Then load your `.env` file as follows:
+
+```matlab
+loadenv(".env")
+```
+
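A quick way to confirm the key was picked up is the standard MATLAB `getenv` function, for example:

```matlab
% Should return the key loaded from the .env file:
getenv("OPENAI_API_KEY")
```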
+### Setting up your Azure OpenAI Service API key
+
+Set up your Azure OpenAI API key. Create a `.env` file in the project root directory with the following content.
+
+```
+AZURE_OPENAI_API_KEY=<your key>
+```
+
+You can use either `KEY1` or `KEY2` from the Azure portal.
+
Then load your `.env` file as follows:

```matlab
@@ -67,7 +100,7 @@ loadenv(".env")

## Getting Started with Chat Completion API

-To get started, you can either create an `openAIChat` object and use its methods or use it in a more complex setup, as needed.
+To get started, you can either create an `openAIChat`, `azureChat`, or `ollamaChat` object and use its methods or use it in a more complex setup, as needed.

### Simple call without preserving chat history

@@ -81,13 +114,14 @@ Here's a simple example of how to use the `openAIChat` for sentiment analysis:
% The system prompt tells the assistant how to behave, in this case, as a sentiment analyzer
systemPrompt = "You are a sentiment analyser. You will look at a sentence and output"+...
" a single word that classifies that sentence as either 'positive' or 'negative'."+....
-"Examples: \n"+...
-"The project was a complete failure. \n"+...
-"negative \n\n"+...
-"The team successfully completed the project ahead of schedule."+...
-"positive \n\n"+...
-"His attitude was terribly discouraging to the team. \n"+...
-"negative \n\n";
+newline + ...
+"Examples:" + newline +...
+"The project was a complete failure." + newline +...
+"negative" + newline + newline +...
+"The team successfully completed the project ahead of schedule." + newline +...
+"positive" + newline + newline +...
+"His attitude was terribly discouraging to the team." + newline +...
+"negative" + newline + newline;

chat = openAIChat(systemPrompt);

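% (A usage sketch, assuming the chat object above: the input sentence is made up,
% and generate is the same method used elsewhere in this README.)
[txt, response] = generate(chat, "The team met every milestone ahead of schedule.");
% txt is expected to contain "positive"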
@@ -112,14 +146,16 @@ Then create the chat assistant:
chat = openAIChat("You are a helpful AI assistant.");
```

-Add a user message to the history and pass it to `generate`
+(Side note: `azureChat` and `ollamaChat` work with `openAIMessages`, too.)
+
+Add a user message to the history and pass it to `generate`:

```matlab
history = addUserMessage(history,"What is an eigenvalue?");
[txt, response] = generate(chat, history)
```

-The output `txt` will contain the answer and `response` will contain the full response, which you need to include in the history as follows
+The output `txt` will contain the answer and `response` will contain the full response, which you need to include in the history as follows:
```matlab
history = addResponseMessage(history, response);
```
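Keeping the full history is what lets the model answer follow-up questions in context. A short sketch of a second turn (the follow-up question here is made up):

```matlab
% Second turn: append another user message to the same history and generate again.
history = addUserMessage(history, "Can you give a concrete 2x2 example?");
[txt, response] = generate(chat, history);
history = addResponseMessage(history, response);
```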
@@ -144,6 +180,8 @@ txt = generate(chat,"What is Model-Based Design and how is it related to Digital

### Calling MATLAB functions with the API

+(This is currently not supported for `ollamaChat`.)
+
Optionally, `Tools=functions` can be used to provide function specifications to the API. The purpose of this is to enable models to generate function arguments which adhere to the provided specifications.
Note that the API is not able to directly call any function, so you should call the function and pass the values to the API directly. This process can be automated as shown in [AnalyzeScientificPapersUsingFunctionCalls.mlx](/examples/AnalyzeScientificPapersUsingFunctionCalls.mlx), but it's important to consider that ChatGPT can hallucinate function names, so avoid executing any arbitrary generated functions and only allow the execution of functions that you have defined.
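A rough sketch of what providing a function specification can look like. The helper names `openAIFunction` and `addParameter` and their keyword arguments are assumptions based on this repository's examples, and the temperature function itself is made up; check the repository documentation for the exact interface.

```matlab
% Assumed sketch of a function specification; names and keyword arguments are
% illustrative and should be verified against the repository documentation.
f = openAIFunction("getTemperature", "Returns the current temperature for a given city.");
f = addParameter(f, "city", type="string", description="Name of the city.");

% Pass the specification via the Tools option mentioned above; the model can then
% propose arguments for getTemperature, which you call yourself in MATLAB.
chat = openAIChat("You are a helpful weather assistant.", Tools=f);
```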

@@ -300,14 +338,13 @@ messages = addUserMessageWithImages(messages,"What is in the image?",image_path)
% Should output the description of the image
```

-## Establishing a connection to Chat Completions API using Azure®
+## Establishing a connection to Chat Completions API using Azure

-If you would like to connect MATLAB to Chat Completions API via Azure® instead of directly with OpenAI, you will have to create an `azureChat` object.
-However, you first need to obtain, in addition to the Azure API keys, your Azure OpenAI Resource.
+If you would like to connect MATLAB to the Chat Completions API via Azure instead of directly with OpenAI, you will have to create an `azureChat` object. See [the Azure documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart) for details on the setup required and where to find your key, endpoint, and deployment name. As explained above, the key should be in the environment variable `AZURE_OPENAI_API_KEY`, or provided as `ApiKey=…` in the `azureChat` call below.

In order to create the chat assistant, you must specify your Azure OpenAI endpoint and the deployment name of the LLM you want to use:
```matlab
-chat = azureChat(YOUR_RESOURCE_NAME, YOUR_DEPLOYMENT_NAME, "You are a helpful AI assistant");
+chat = azureChat(YOUR_ENDPOINT_NAME, YOUR_DEPLOYMENT_NAME, "You are a helpful AI assistant");
```

The `azureChat` object also allows you to specify additional options in the same way as the `openAIChat` object.
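For example, to pass the key explicitly via the `ApiKey` option mentioned above rather than through the environment variable (the endpoint and deployment names are placeholders):

```matlab
% Sketch: provide the Azure key directly via the ApiKey option instead of
% relying on the AZURE_OPENAI_API_KEY environment variable.
chat = azureChat(YOUR_ENDPOINT_NAME, YOUR_DEPLOYMENT_NAME, ...
    "You are a helpful AI assistant", ApiKey="<your key>");
```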
@@ -329,7 +366,32 @@ history = addUserMessage(history,"What is an eigenvalue?");
history = addResponseMessage(history, response);
```

-### Obtaining embeddings
+## Establishing a connection to local LLMs using Ollama
+
+If you want to use a local LLM (e.g., to avoid sending sensitive data to a cloud provider, or to use other models), you will need to install Ollama and pull a model, following the instructions on [ollama.com](https://ollama.com). Ollama needs to run on the same machine as your MATLAB instance.
+
+In order to create the chat assistant, you must specify the LLM you want to use:
+```matlab
+chat = ollamaChat("mistral");
+```
+
+The additional options of `ollamaChat` are similar to those of `openAIChat` and `azureChat`.
+
+In many workflows, `ollamaChat` is drop-in compatible with `openAIChat`:
+```matlab
+% Initialize the chat object
+chat = ollamaChat("phi3");
+
+% Create an openAIMessages object to start the conversation history
+history = openAIMessages;
+
+% Ask your question and store it in the history, create the response using the generate method, and store the response in the history
+history = addUserMessage(history,"What is an eigenvalue?");
+[txt, response] = generate(chat, history)
+history = addResponseMessage(history, response);
+```
+
+## Obtaining embeddings

You can extract embeddings from your text with OpenAI using the function `extractOpenAIEmbeddings` as follows:
```matlab
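% Assumed minimal usage of extractOpenAIEmbeddings (a sketch; check the
% repository documentation for the exact signature and outputs):
emb = extractOpenAIEmbeddings("Here is some example text to embed.");
```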
