Commit 82603af ("merge upstream and resolve conflicts")
2 parents: 0aa171b + 2348e8e

31 files changed: +564 / -260 lines

.scalafix.conf
Lines changed: 10 additions & 0 deletions

```diff
@@ -0,0 +1,10 @@
+rules = [
+  DisableSyntax,
+  ExplicitResultTypes,
+  LeakingImplicitClassVal,
+  NoAutoTupling,
+  NoValInForComprehension,
+  ProcedureSyntax,
+  RedundantSyntax,
+  RemoveUnused
+]
```
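These rules lint and rewrite Scala sources. As a rough illustration (a hypothetical snippet, not from this repo), code that satisfies `ExplicitResultTypes` and `NoValInForComprehension` looks like this:

```scala
// Hypothetical example conforming to two of the enabled rules:
// - ExplicitResultTypes: public members declare their result type
// - NoValInForComprehension: `y = ...` instead of `val y = ...` in a for-comprehension
object RulesDemo {
  def doubled(xs: Seq[Int]): Seq[Int] = // explicit result type
    for {
      x <- xs
      y = x * 2 // no `val` keyword here
    } yield y
}

println(RulesDemo.doubled(Seq(1, 2, 3))) // List(2, 4, 6)
```

Running `sbt scalafix` rewrites or flags violations of these rules automatically.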

.scalafmt.conf
Lines changed: 2 additions & 0 deletions

```diff
@@ -0,0 +1,2 @@
+version = 3.7.4
+runner.dialect = scala213
```

README.md
Lines changed: 58 additions & 11 deletions

````diff
@@ -1,11 +1,11 @@
 # OpenAI Scala Client 🤖
-[![version](https://img.shields.io/badge/version-0.3.3-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd)
+[![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd)
 
 This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **ChatGPT completion**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
 
 * **Models**: [listModels](https://platform.openai.com/docs/api-reference/models/list), and [retrieveModel](https://platform.openai.com/docs/api-reference/models/retrieve)
 * **Completions**: [createCompletion](https://platform.openai.com/docs/api-reference/completions/create)
-* **Chat Completions**: [createChatCompletion](https://platform.openai.com/docs/api-reference/chat/create)
+* **Chat Completions**: [createChatCompletion](https://platform.openai.com/docs/api-reference/chat/create), and [createChatFunCompletion](https://platform.openai.com/docs/api-reference/chat/create) **(🔥 new)**
 * **Edits**: [createEdit](https://platform.openai.com/docs/api-reference/edits/create)
 * **Images**: [createImage](https://platform.openai.com/docs/api-reference/images/create), [createImageEdit](https://platform.openai.com/docs/api-reference/images/create-edit), and [createImageVariation](https://platform.openai.com/docs/api-reference/images/create-variation)
 * **Embeddings**: [createEmbeddings](https://platform.openai.com/docs/api-reference/embeddings/create)
@@ -30,7 +30,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.3.3"
+"io.cequence" %% "openai-scala-client" % "0.4.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -39,11 +39,11 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.3.3</version>
+    <version>0.4.0</version>
 </dependency>
 ```
 
-If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.3.3"` instead.
+If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.4.0"` instead.
 
 ## Config ⚙️
 
@@ -170,7 +170,7 @@ Examples:
     println(completion.choices.head.text)
   ).runWith(Sink.ignore)
 ```
-(For this to work you need to use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream` lib)
+For this to work you need to use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream` lib.
 
 - Create chat completion
 
@@ -194,23 +194,70 @@ Examples:
   }
 ```
 
+- Create chat completion for functions (🔥 new)
+
+```scala
+val messages = Seq(
+  FunMessageSpec(role = ChatRole.User, content = Some("What's the weather like in Boston?")),
+)
+
+// as a param type we can use "number", "string", "boolean", "object", "array", and "null"
+val functions = Seq(
+  FunctionSpec(
+    name = "get_current_weather",
+    description = Some("Get the current weather in a given location"),
+    parameters = Map(
+      "type" -> "object",
+      "properties" -> Map(
+        "location" -> Map(
+          "type" -> "string",
+          "description" -> "The city and state, e.g. San Francisco, CA",
+        ),
+        "unit" -> Map(
+          "type" -> "string",
+          "enum" -> Seq("celsius", "fahrenheit")
+        )
+      ),
+      "required" -> Seq("location"),
+    )
+  )
+)
+
+// if we want to force the model to use the above function as a response
+// we can do so by passing: responseFunctionName = Some("get_current_weather")
+service.createChatFunCompletion(
+  messages = messages,
+  functions = functions,
+  responseFunctionName = None
+).map { response =>
+  val chatFunCompletionMessage = response.choices.head.message
+  val functionCall = chatFunCompletionMessage.function_call
+
+  println("function call name      : " + functionCall.map(_.name).getOrElse("N/A"))
+  println("function call arguments : " + functionCall.map(_.arguments).getOrElse("N/A"))
+}
+```
+Note that instead of `MessageSpec`, the `function_call` version of the chat completion uses the `FunMessageSpec` class to define messages - both as part of the request and the response.
+This extension of the standard chat completion is currently supported by the following `0613` models, all conveniently available in the `ModelId` object:
+- `gpt-3.5-turbo-0613` (default), `gpt-3.5-turbo-16k-0613`, `gpt-4-0613`, and `gpt-4-32k-0613`.
+
 
 **✔️ Important Note**: After you are done using the service, you should close it by calling (🔥 new) `service.close`. Otherwise, the underlying resources/threads won't be released.
 
 **III. Using multiple services (🔥 new)**
 
-- Load distribution with `OpenAIMultiServiceAdapter` - _rotation type_ aka "round robin"
+- Load distribution with `OpenAIMultiServiceAdapter` - _round robin_ (_rotation_) type
 
 ```scala
 val service1 = OpenAIServiceFactory("your-api-key1")
 val service2 = OpenAIServiceFactory("your-api-key2")
 val service3 = OpenAIServiceFactory("your-api-key3")
 
-val service = OpenAIMultiServiceAdapter.ofRotationType(service1, service2, service3)
+val service = OpenAIMultiServiceAdapter.ofRoundRobinType(service1, service2, service3)
 
 service.listModels.map { models =>
   models.foreach(println)
-  service.close
+  service.close()
 }
 ```
 
@@ -221,11 +268,11 @@ Examples:
 val service2 = OpenAIServiceFactory("your-api-key2")
 val service3 = OpenAIServiceFactory("your-api-key3")
 
-val service = OpenAIMultiServiceAdapter.ofRandomAccessType(service1, service2, service3)
+val service = OpenAIMultiServiceAdapter.ofRandomOrderType(service1, service2, service3)
 
 service.listModels.map { models =>
   models.foreach(println)
-  service.close
+  service.close()
 }
 ```
 
````
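The renamed round-robin adapter dispatches each call to the next underlying service in rotation. A minimal standalone sketch of that idea (the `RoundRobin` class below is illustrative, not the library's actual implementation):

```scala
import java.util.concurrent.atomic.AtomicInteger

// Illustrative sketch of round-robin dispatch: each call to next()
// returns the following element, wrapping back to the first one.
final class RoundRobin[A](services: IndexedSeq[A]) {
  private val counter = new AtomicInteger(0)

  // getAndIncrement is thread-safe; floorMod keeps the index
  // non-negative even if the counter overflows
  def next(): A =
    services(math.floorMod(counter.getAndIncrement(), services.size))
}

val rr = new RoundRobin(IndexedSeq("service1", "service2", "service3"))
println(Seq.fill(4)(rr.next())) // the 4th call wraps back to service1
```

In contrast, the random-order variant shown next picks an underlying service at random on each call.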

build.sbt
Lines changed: 57 additions & 7 deletions

```diff
@@ -1,14 +1,14 @@
 import sbt.Keys.test
 
 // Supported versions
-val scala212 = "2.12.15"
-val scala213 = "2.13.10"
+val scala212 = "2.12.18"
+val scala213 = "2.13.11"
 val scala3 = "3.2.2"
 val AkkaVersion = "2.6.1"
 
 ThisBuild / organization := "io.cequence"
 ThisBuild / scalaVersion := scala212
-ThisBuild / version := "0.3.3"
+ThisBuild / version := "0.4.0"
 ThisBuild / isSnapshot := false
 
 lazy val commonSettings = Seq(
@@ -36,16 +36,27 @@ lazy val guice = (project in file("openai-guice"))
   .dependsOn(client)
   .aggregate(client_stream)
 
-
 // POM settings for Sonatype
-ThisBuild / homepage := Some(url("https://github.com/cequence-io/openai-scala-client"))
+ThisBuild / homepage := Some(
+  url("https://github.com/cequence-io/openai-scala-client")
+)
 
 ThisBuild / sonatypeProfileName := "io.cequence"
 
-ThisBuild / scmInfo := Some(ScmInfo(url("https://github.com/cequence-io/openai-scala-client"), "scm:git@github.com:cequence-io/openai-scala-client.git"))
+ThisBuild / scmInfo := Some(
+  ScmInfo(
+    url("https://github.com/cequence-io/openai-scala-client"),
+    "scm:git@github.com:cequence-io/openai-scala-client.git"
+  )
+)
 
 ThisBuild / developers := List(
-  Developer("bnd", "Peter Banda", "[email protected]", url("https://peterbanda.net"))
+  Developer(
+    "bnd",
+    "Peter Banda",
+    "[email protected]",
+    url("https://peterbanda.net")
+  )
 )
 
 ThisBuild / licenses += "MIT" -> url("https://opensource.org/licenses/MIT")
@@ -57,3 +68,42 @@ ThisBuild / sonatypeCredentialHost := "s01.oss.sonatype.org"
 ThisBuild / sonatypeRepository := "https://s01.oss.sonatype.org/service/local"
 
 ThisBuild / publishTo := sonatypePublishToBundle.value
+
+addCommandAlias(
+  "validateCode",
+  List(
+    "scalafix",
+    "scalafmtSbtCheck",
+    "scalafmtCheckAll",
+    "test:scalafix",
+    "test:scalafmtCheckAll"
+  ).mkString(";")
+)
+
+addCommandAlias(
+  "formatCode",
+  List(
+    "scalafmt",
+    "scalafmtSbt",
+    "Test/scalafmt"
+  ).mkString(";")
+)
+
+addCommandAlias(
+  "testWithCoverage",
+  List(
+    "coverage",
+    "test",
+    "coverageReport"
+  ).mkString(";")
+)
+
+
+inThisBuild(
+  List(
+    scalacOptions += "-Ywarn-unused",
+    scalaVersion := "2.12.15",
+    semanticdbEnabled := true,
+    semanticdbVersion := scalafixSemanticdb.revision
+  )
+)
```
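The `addCommandAlias` blocks join their sub-commands with `mkString(";")`, producing a single `;`-separated string that sbt executes sequentially. A quick check of what the `validateCode` alias expands to:

```scala
// The same List-joining pattern as in the build: sub-commands
// concatenated with ";" so sbt runs them one after another.
val validateCode = List(
  "scalafix",
  "scalafmtSbtCheck",
  "scalafmtCheckAll",
  "test:scalafix",
  "test:scalafmtCheckAll"
).mkString(";")

println(validateCode)
// scalafix;scalafmtSbtCheck;scalafmtCheckAll;test:scalafix;test:scalafmtCheckAll
```

So running `sbt validateCode` is equivalent to invoking those five tasks in order.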

openai-client-stream/README.md
Lines changed: 3 additions & 3 deletions

````diff
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.3.3-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides streaming support for the client. Note that the full project documentation can be found [here](../README.md).
 
@@ -9,7 +9,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client-stream" % "0.3.3"
+"io.cequence" %% "openai-scala-client-stream" % "0.4.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -18,6 +18,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client-stream_2.12</artifactId>
-    <version>0.3.3</version>
+    <version>0.4.0</version>
 </dependency>
 ```
````

openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala
Lines changed: 2 additions & 2 deletions

```diff
@@ -2,7 +2,7 @@ package io.cequence.openaiscala.service
 
 import akka.NotUsed
 import akka.stream.scaladsl.Source
-import io.cequence.openaiscala.domain.MessageSpec
+import io.cequence.openaiscala.domain.{FunctionSpec, MessageSpec}
 import io.cequence.openaiscala.domain.response.{ChatCompletionChunkResponse, ChatCompletionResponse, FineTuneEvent, TextCompletionResponse}
 import io.cequence.openaiscala.domain.settings.{CreateChatCompletionSettings, CreateCompletionSettings}
 
@@ -29,7 +29,7 @@ trait OpenAIServiceStreamedExtra extends OpenAIServiceConsts {
   /**
    * Creates a completion for the chat message(s) with streamed results.
    *
-   * @param messages The messages to generate chat completions.
+   * @param messages A list of messages comprising the conversation so far.
    * @param settings
    * @return chat completion response
    *
```

openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedImpl.scala
Lines changed: 5 additions & 10 deletions

```diff
@@ -15,12 +15,7 @@ import play.api.libs.json.{JsValue, Json}
 import scala.concurrent.ExecutionContext
 
 /**
- * Private impl. class of [[OpenAIService]].
- *
- * @param apiKey
- * @param orgId
- * @param ec
- * @param materializer
+ * Private impl. class of [[OpenAIServiceStreamedExtra]] which offers extra functions with streaming support.
  *
  * @since Jan 2023
  */
@@ -32,7 +27,7 @@ private trait OpenAIServiceStreamedExtraImpl extends OpenAIServiceStreamedExtra
     settings: CreateCompletionSettings
   ): Source[TextCompletionResponse, NotUsed] =
     execJsonStreamAux(
-      Command.completions,
+      EndPoint.completions,
       "POST",
       bodyParams = createBodyParamsForCompletion(prompt, settings, stream = true)
     ).map { (json: JsValue) =>
@@ -48,7 +43,7 @@ private trait OpenAIServiceStreamedExtraImpl extends OpenAIServiceStreamedExtra
     settings: CreateChatCompletionSettings = DefaultSettings.CreateChatCompletion
   ): Source[ChatCompletionChunkResponse, NotUsed] =
     execJsonStreamAux(
-      Command.chat_completions,
+      EndPoint.chat_completions,
       "POST",
       bodyParams = createBodyParamsForChatCompletion(messages, settings, stream = true)
     ).map { (json: JsValue) =>
@@ -63,11 +58,11 @@ private trait OpenAIServiceStreamedExtraImpl extends OpenAIServiceStreamedExtra
     fineTuneId: String
   ): Source[FineTuneEvent, NotUsed] =
     execJsonStreamAux(
-      Command.fine_tunes,
+      EndPoint.fine_tunes,
       "GET",
       endPointParam = Some(s"$fineTuneId/events"),
       params = Seq(
-        Tag.stream -> Some(true)
+        Param.stream -> Some(true)
       )
     ).map { json =>
       (json \ "error").toOption.map { error =>
```

openai-client/README.md
Lines changed: 3 additions & 3 deletions

````diff
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.3.3-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides the actual meat, i.e. WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
 Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.3.3"
+"io.cequence" %% "openai-scala-client" % "0.4.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.3.3</version>
+    <version>0.4.0</version>
 </dependency>
 ```
````

openai-client/build.sbt
Lines changed: 2 additions & 2 deletions

```diff
@@ -6,8 +6,8 @@ lazy val playWsVersion = settingKey[String]("Play WS version to use")
 
 playWsVersion := {
   scalaVersion.value match {
-    case "2.12.15" => "2.1.10"
-    case "2.13.10" => "2.2.0-M3"
+    case "2.12.18" => "2.1.10"
+    case "2.13.11" => "2.2.0-M3"
     case "3.2.2" => "2.2.0-M2" // Version "2.2.0-M3" was produced by an unstable release: Scala 3.3.0-RC3
     case _ => "2.1.10"
   }
```
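The match above pins a Play WS version per Scala version, falling back to `2.1.10` for anything unlisted. The same selection logic, extracted as a plain function (the helper name is hypothetical, for illustration only):

```scala
// Standalone version of the Play WS selection used in openai-client/build.sbt:
// each supported Scala version maps to a compatible Play WS release.
def playWsVersionFor(scalaVersion: String): String =
  scalaVersion match {
    case "2.12.18" => "2.1.10"
    case "2.13.11" => "2.2.0-M3"
    case "3.2.2"   => "2.2.0-M2" // "2.2.0-M3" was built with unstable Scala 3.3.0-RC3
    case _         => "2.1.10"   // fallback for unlisted versions
  }

println(playWsVersionFor("2.13.11")) // 2.2.0-M3
```

Note that bumping `scala212`/`scala213` in the root `build.sbt` requires updating these case patterns in lockstep, which is exactly what this commit does.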
