
Commit a710b2e

README - renaming API-based -> cloud-based
1 parent: 1dcd7ae

File tree: 1 file changed


README.md

Lines changed: 6 additions & 6 deletions
@@ -22,12 +22,12 @@ Note that in order to be consistent with the OpenAI API naming, the service func
  Also, we aimed the lib to be self-contained with the fewest dependencies possible; therefore, we ended up using only two libs, `play-ahc-ws-standalone` and `play-ws-standalone-json` (at the top level). Additionally, if dependency injection is required, we use the `scala-guice` lib as well.

  (🔥 **New**) In addition to the OpenAI API, this library also supports "API-compatible" providers such as:
- - [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - API-based, utilizes OpenAI models but with lower latency
- - [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - API-based, offers a vast selection of open-source models
- - [Anthropic](https://www.anthropic.com/api) - API-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus
- - [Groq](https://wow.groq.com/) - API-based, known for its super-fast inference with LPUs
- - [Fireworks](https://fireworks.ai/) - API-based
- - [OctoAI](https://octo.ai/) - API-based
+ - [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - cloud-based, utilizes OpenAI models but with lower latency
+ - [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - cloud-based, offers a vast selection of open-source models
+ - [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus
+ - [Groq](https://wow.groq.com/) - cloud-based, known for its super-fast inference with LPUs
+ - [Fireworks](https://fireworks.ai/) - cloud-based
+ - [OctoAI](https://octo.ai/) - cloud-based
  - [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
  - [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, LLaMA2, and FastChat-T5
