Also, we aimed for the library to be self-contained with as few dependencies as possible; therefore, we ended up using only two libraries at the top level: `play-ahc-ws-standalone` and `play-ws-standalone-json`. Additionally, if dependency injection is required, we use the `scala-guice` lib as well.
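For reference, adding the client to a project might look like the following sbt fragment. The group ID, artifact name, and version below are assumptions and should be verified against the project's published releases:

```scala
// build.sbt — coordinates are an assumption; check the latest published release
libraryDependencies += "io.cequence" %% "openai-scala-client" % "1.0.0"
```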
(🔥 **New**) In addition to the OpenAI API, this library also supports "API-compatible" providers such as:
- [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - cloud-based, utilizes OpenAI models but with lower latency
- [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - cloud-based, offers a vast selection of open-source models
- [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus
- [Groq](https://wow.groq.com/) - cloud-based, known for its super-fast inference with LPUs
- [Fireworks](https://fireworks.ai/) - cloud-based
- [OctoAI](https://octo.ai/) - cloud-based
- [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
- [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, LLaMA2, and FastChat-T5
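Since these providers expose an OpenAI-compatible API, plugging one in typically amounts to pointing the service at a different base URL with the provider's auth header. The sketch below illustrates the idea; the factory name, package, and parameter names are assumptions, not confirmed by this README excerpt, so consult the library's provider-specific documentation for the exact API:

```scala
// Assumed import path and factory — verify against the library's docs
import io.cequence.openaiscala.service.OpenAIChatCompletionServiceFactory

// Point the standard chat-completion service at Groq's OpenAI-compatible
// endpoint; other compatible providers work the same way, each with its own
// base URL and authorization header.
val service = OpenAIChatCompletionServiceFactory(
  coreUrl = "https://api.groq.com/openai/v1/",
  authHeaders = Seq(("Authorization", s"Bearer ${sys.env("GROQ_API_KEY")}"))
)
```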