
Commit b29b780

Commit message: "README - typos fixed"
1 parent 23aa3eb commit b29b780

File tree: 1 file changed, +9 -8 lines changed

README.md

Lines changed: 9 additions & 8 deletions
````diff
@@ -1,7 +1,7 @@
 # OpenAI Scala Client 🤖

 [![version](https://img.shields.io/badge/version-1.0.0.RC.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)

-This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **ChatGPT completion**, **vision**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
+This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **chat completion**, **vision**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:

 * **Models**: [listModels](https://platform.openai.com/docs/api-reference/models/list), and [retrieveModel](https://platform.openai.com/docs/api-reference/models/retrieve)
 * **Completions**: [createCompletion](https://platform.openai.com/docs/api-reference/completions/create)
````
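For orientation, the calls listed above are plain methods on `OpenAIService` returning `Future`s. A minimal, hedged sketch (assuming the default `OpenAIServiceFactory()` picks up the API key from config/environment, as the README's Config section describes; not shown in this diff):

```scala
import scala.concurrent.ExecutionContext.Implicits.global

import io.cequence.openaiscala.service.OpenAIServiceFactory

object ListModelsExample extends App {
  // Assumes the API key is provided via config/env, per the Config section
  val service = OpenAIServiceFactory()

  // listModels is async; print the id of each model available to the account
  service.listModels.map { models =>
    models.foreach(model => println(model.id))
    service.close()
  }
}
```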
````diff
@@ -23,31 +23,31 @@ Also, we aimed the lib to be self-contained with the fewest dependencies possibl

 ---

-(🔥 **New**) In addition to the OpenAI API, this library also supports "API-compatible" providers such as:
+(🔥 **New**) In addition to the OpenAI API, this library also supports API-compatible providers such as:

 - [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - cloud-based, utilizes OpenAI models but with lower latency
 - [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - cloud-based, offers a vast selection of open-source models
 - [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus
 - [Groq](https://wow.groq.com/) - cloud-based, known for its super-fast inference with LPUs
 - [Fireworks AI](https://fireworks.ai/) - cloud-based
 - [OctoAI](https://octo.ai/) - cloud-based
 - [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
-- [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, LLaMA2, and FastChat-T5
+- [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, and FastChat-T5

 See [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/nonopenai) for more details.

 ---

 👉 For background information read an article about the lib/client on [Medium](https://medium.com/@0xbnd/openai-scala-client-is-out-d7577de934ad).

-Try out also our [Scala client for Pinecone vector database](https://github.com/cequence-io/pinecone-scala), or use both clients together! [This demo project](https://github.com/cequence-io/pinecone-openai-scala-demo) shows how to generate and store OpenAI embeddings (with `text-embedding-ada-002` model) into Pinecone and query them afterward. The OpenAI + Pinecone combo is commonly used for autonomous AI agents, such as [babyAGI](https://github.com/yoheinakajima/babyagi) and [AutoGPT](https://github.com/Significant-Gravitas/Auto-GPT).
+Also try out our [Scala client for Pinecone vector database](https://github.com/cequence-io/pinecone-scala), or use both clients together! [This demo project](https://github.com/cequence-io/pinecone-openai-scala-demo) shows how to generate and store OpenAI embeddings (with `text-embedding-ada-002` model) into Pinecone and query them afterward. The OpenAI + Pinecone combo is commonly used for autonomous AI agents, such as [babyAGI](https://github.com/yoheinakajima/babyagi) and [AutoGPT](https://github.com/Significant-Gravitas/Auto-GPT).

 **✔️ Important**: this is a "community-maintained" library and, as such, has no relation to OpenAI company.

 ## Installation 🚀

 The currently supported Scala versions are **2.12, 2.13**, and **3**.

-To pull the library you have to add the following dependency to your *build.sbt*
+To install the library, add the following dependency to your *build.sbt*

 ```
 "io.cequence" %% "openai-scala-client" % "1.0.0.RC.1"
````
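For context, the dependency line shown in the hunk above sits in an sbt build like the following sketch (project name and exact Scala version are illustrative; any of the supported 2.12/2.13/3 versions should work):

```scala
// build.sbt - minimal project pulling in the client (names are illustrative)
name := "openai-scala-demo"

scalaVersion := "2.13.12"

libraryDependencies += "io.cequence" %% "openai-scala-client" % "1.0.0.RC.1"
```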
````diff
@@ -63,7 +63,7 @@ or to *pom.xml* (if you use maven)
 </dependency>
 ```

-If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "1.0.0.RC.1"` instead.
+If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.0.0.RC.1"` instead.

 ## Config ⚙️

````
````diff
@@ -385,8 +385,9 @@ class MyCompletionService extends OpenAICountTokensHelper {

 **III. Using adapters**

-Adapters for OpenAI services (chat completion, core, or full) are provided by [OpenAIServiceAdapters](./openai-core/src/main/scala/io/cequence/openaiscala/service/adapter/OpenAIServiceAdapters). The adapters are used to distribute the load between multiple services, retry on transient errors, route, or provide additional functionality. See [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/adapter) for more details.
-Note that adapters can be arbitrarily combined/stacked.
+Adapters for OpenAI services (chat completion, core, or full) are provided by [OpenAIServiceAdapters](./openai-core/src/main/scala/io/cequence/openaiscala/service/adapter/OpenAIServiceAdapters.scala). The adapters are used to distribute the load between multiple services, retry on transient errors, route, or provide additional functionality. See [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/adapter) for more details.
+
+Note that the adapters can be arbitrarily combined/stacked.

 - **Round robin** load distribution

````