@@ -7,6 +7,10 @@ AI Server allows you to orchestrate your system's AI requests through a single se
[![](/img/pages/ai-server/overview.svg)](https://openai.servicestack.net)
+ :::youtube Ojo80oFQte8
+ Self Hosted AI Server gateway for LLM APIs, Ollama, ComfyUI & FFmpeg servers
+ :::
+
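
For context, here is a minimal sketch of what a request through the gateway can look like, assuming your AI Server instance exposes an OpenAI-compatible chat completions endpoint; the base URL, API key, and model name below are placeholders rather than values from this change:

```typescript
// Minimal sketch: send a chat request to a self-hosted AI Server instance.
// Assumes an OpenAI-compatible /v1/chat/completions endpoint; the base URL,
// API key and model name are hypothetical placeholders.
const AI_SERVER_URL = "https://ai-server.example.org"; // hypothetical self-hosted instance
const API_KEY = process.env.AI_SERVER_API_KEY ?? "";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${AI_SERVER_URL}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "llama3.1:8b", // any model exposed by one of your configured providers
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`AI Server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // OpenAI-style response shape
}

chat("Summarize what an AI gateway does in one sentence.").then(console.log);
```
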
## Why Use AI Server?
AI Server simplifies the integration and management of AI capabilities in your applications:
@@ -4,6 +4,10 @@ title: Self-hosted AI Providers with Ollama
Ollama can be used as an AI Provider type to process LLM requests in AI Server.
+ :::youtube S1Xw0iQLa2c
+ Using Ollama with AI Server
+ :::
+
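
Before registering Ollama as a provider, it can help to confirm the instance is reachable and see which models it already serves. A minimal sketch using Ollama's standard `/api/tags` (list models) endpoint; the localhost URL assumes Ollama's default port:

```typescript
// Minimal sketch: confirm a local Ollama instance is reachable and list the
// models it serves, before registering it as an AI Provider in AI Server.
const OLLAMA_URL = "http://localhost:11434"; // Ollama's default port

async function listOllamaModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

listOllamaModels().then(models => console.log("Installed models:", models));
```
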
## Setting up Ollama
When using Ollama as an AI Provider, you will need to ensure the models you want to use are available in your Ollama instance.
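
One way to automate that check is sketched below, using Ollama's `/api/tags` and `/api/pull` endpoints; the model name is a placeholder for whichever models your provider should expose, and older Ollama versions used a `name` field instead of `model` in the pull request body:

```typescript
// Minimal sketch: make sure a required model is available in the Ollama
// instance, pulling it via /api/pull if it is missing.
const OLLAMA_URL = "http://localhost:11434";

async function ensureModel(model: string): Promise<void> {
  const tags = await fetch(`${OLLAMA_URL}/api/tags`).then(r => r.json());
  // /api/tags lists exact model tags, e.g. "llama3.1:8b"
  const installed = tags.models.some((m: { name: string }) => m.name === model);
  if (installed) {
    console.log(`${model} already available`);
    return;
  }
  // With stream: false, Ollama returns a single status object once the pull completes.
  const res = await fetch(`${OLLAMA_URL}/api/pull`, {
    method: "POST",
    body: JSON.stringify({ model, stream: false }), // older Ollama versions expect "name"
  });
  if (!res.ok) throw new Error(`Pull failed: ${res.status}`);
  console.log(`${model} pulled`);
}

ensureModel("llama3.1:8b"); // placeholder model name
```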