README.md: 82 additions & 40 deletions
@@ -33,7 +33,7 @@ Also, we aimed the lib to be self-contained with the fewest dependencies possibl
- [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
- [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, LLaMA2, and FastChat-T5
-See [examples](https://github.com/cequence-io/openai-scala-client/tree/master/openai-examples/src/main/scala/io/cequence/openaiscala/examples/nonopenai) for more details.
+See [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/nonopenai) for more details.
---
@@ -176,7 +176,7 @@ Then you can obtain a service in one of the following ways.
@@ -208,9 +208,8 @@ or if only streaming is required
**II. Calling functions**
Full documentation of each call with its respective inputs and settings is provided in [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). Since all the calls are async they return responses wrapped in `Future`.
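Since every call returns a `Future`, callers typically `map`/`flatMap` over the result and block only at the very edge of the program. A minimal sketch of that pattern, using a hypothetical stand-in trait rather than the actual `OpenAIService` API:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

// Hypothetical stand-in for an async service; the real OpenAIService exposes
// many such Future-returning calls (list models, chat completions, ...).
trait ModelService {
  def listModels: Future[Seq[String]]
}

object FutureDemo extends App {
  val service: ModelService = new ModelService {
    def listModels: Future[Seq[String]] = Future.successful(Seq("model-a", "model-b"))
  }

  // Block only at the program's edge; prefer map/flatMap everywhere else.
  val models: Seq[String] = Await.result(service.listModels, 5.seconds)
  println(models.mkString(", "))
}
```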
-🔥 **New**: There is a new project [openai-scala-client-examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples) where you can find a lot of ready-to-use examples!
-Examples:
+🔥 **New**: There is a new project [openai-scala-client-examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples) where you can find a lot of ready-to-use examples!
- List models
@@ -384,50 +383,63 @@ class MyCompletionService extends OpenAICountTokensHelper {
---
-**III. Using multiple services**
+**III. Using adapters**
+
+Adapters for OpenAI services (chat completion, core, or full) are provided by [OpenAIServiceAdapters](./openai-core/src/main/scala/io/cequence/openaiscala/service/adapter/OpenAIServiceAdapters). The adapters are used to distribute load across multiple services, retry on transient errors, route calls, or provide additional functionality. See [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/adapter) for more details.
+
+Note that adapters can be arbitrarily combined/stacked.
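Stacking works because each adapter implements the same service interface it wraps, so any adapter can wrap any other. A simplified sketch of the idea; the trait and class names below are illustrative, not the library's actual API:

```scala
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

// Illustrative minimal interface; the real adapters wrap OpenAIService instances.
trait Service {
  def call(input: String): Future[String]
}

// Retry adapter: on failure, try again, up to maxAttempts attempts in total.
class RetryAdapter(underlying: Service, maxAttempts: Int) extends Service {
  def call(input: String): Future[String] =
    underlying.call(input).recoverWith {
      case _ if maxAttempts > 1 =>
        new RetryAdapter(underlying, maxAttempts - 1).call(input)
    }
}

// Logging adapter: since it is itself a Service, it can wrap any other adapter.
class LogAdapter(underlying: Service) extends Service {
  def call(input: String): Future[String] = {
    println(s"calling service with: $input")
    underlying.call(input)
  }
}

// Stacked: logging around retry around a base service.
// val service = new LogAdapter(new RetryAdapter(baseService, maxAttempts = 3))
```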
-Load distribution with `OpenAIMultiServiceAdapter` - _round robin_ (_rotation_) type
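The round-robin (rotation) strategy can be sketched as follows; this is a simplified illustration of the dispatch idea, not the library's `OpenAIMultiServiceAdapter` implementation:

```scala
import java.util.concurrent.atomic.AtomicInteger
import scala.concurrent.Future

// Illustrative interface; the real adapter rotates over full OpenAIService instances.
trait CompletionService {
  def complete(prompt: String): Future[String]
}

// Each call is dispatched to the next underlying service, wrapping around (rotation).
class RoundRobinAdapter(underlying: Seq[CompletionService]) extends CompletionService {
  require(underlying.nonEmpty, "at least one underlying service is required")
  private val counter = new AtomicInteger(0)

  def complete(prompt: String): Future[String] = {
    val index = math.abs(counter.getAndIncrement() % underlying.size)
    underlying(index).complete(prompt)
  }
}
```

With two underlying services, successive calls alternate between them, spreading the request load (and per-key rate limits) evenly.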