
Commit 6384f3f

Support Scala 2.12 - fixes #48. (#63)
1 parent cd2fc4a commit 6384f3f

File tree: 3 files changed (+31, -21 lines)


.travis.yml

Lines changed: 1 addition & 1 deletion
@@ -4,8 +4,8 @@ jdk:
   - oraclejdk8

 scala:
-  - 2.10.6
   - 2.11.8
+  - 2.12.1

 sudo: false

README.md

Lines changed: 21 additions & 14 deletions
@@ -1,5 +1,5 @@
 # scalatest-embedded-kafka
-A library that provides an in-memory Kafka broker to run your ScalaTest specs against. It uses Kafka 0.10.1.0 and ZooKeeper 3.4.9.
+A library that provides an in-memory Kafka broker to run your ScalaTest specs against. It uses Kafka 0.10.1.1 and ZooKeeper 3.4.8.

 The version supporting Kafka 0.8.x can be found [here](https://github.com/manub/scalatest-embedded-kafka/tree/kafka-0.8) - *this is no longer actively supported, although I'll be happy to accept PRs and produce releases.*
@@ -13,11 +13,18 @@ Inspired by https://github.com/chbatey/kafka-unit

 [![License](http://img.shields.io/:license-mit-blue.svg)](http://doge.mit-license.org)

-## How to use

-scalatest-embedded-kafka is available on Bintray and Maven Central, compiled for both Scala 2.10 and 2.11
+### Version compatibility matrix

-* In your `build.sbt` file add the following dependency: `"net.manub" %% "scalatest-embedded-kafka" % "0.10.0" % "test"`
+scalatest-embedded-kafka is available on Bintray and Maven Central, compiled for both Scala 2.11 and 2.12.
+
+* Scala 2.10 is supported until `0.10.0`
+* Scala 2.11 is supported for all versions
+* Scala 2.12 is supported from `0.11.0`. Please note that Kafka support for Scala 2.12 is currently marked as pre-alpha by the Kafka team.
+
+### How to use
+
+* In your `build.sbt` file add the following dependency: `"net.manub" %% "scalatest-embedded-kafka" % "0.11.0" % "test"`
 * Have your `Spec` extend the `EmbeddedKafka` trait.
 * Enclose the code that needs a running instance of Kafka within the `withRunningKafka` closure.
 ```scala
@@ -53,7 +60,7 @@ class MySpec extends WordSpec {

 Please note that in order to avoid Kafka instances not shutting down properly, it's recommended to call `EmbeddedKafka.stop()` in an `after` block or in similar teardown logic.

-## Configuration
+### Configuration

 It's possible to change the ports on which Zookeeper and Kafka are started by providing an implicit `EmbeddedKafkaConfig`
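To make the usage and configuration changes above concrete, here is a minimal sketch of the kind of spec the new README text describes. It assumes ScalaTest's `WordSpec` and `BeforeAndAfterAll`; the `EmbeddedKafkaConfig` field names `kafkaPort` and `zooKeeperPort` are illustrative assumptions, since the README's configuration snippet is not part of this hunk.

```scala
import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
import org.scalatest.{BeforeAndAfterAll, WordSpec}

class MySpec extends WordSpec with EmbeddedKafka with BeforeAndAfterAll {

  // Custom ports are optional; the field names here are assumptions — the README
  // only states that an implicit EmbeddedKafkaConfig can be provided.
  implicit val kafkaConfig: EmbeddedKafkaConfig =
    EmbeddedKafkaConfig(kafkaPort = 7000, zooKeeperPort = 7001)

  "the service under test" should {
    "talk to the in-memory broker" in {
      withRunningKafka {
        // code that needs a running Kafka instance goes here
      }
    }
  }

  // Recommended teardown so broker instances shut down properly.
  override def afterAll(): Unit = EmbeddedKafka.stop()
}
```

With a custom config in implicit scope, `withRunningKafka` starts the broker and ZooKeeper on those ports for just the enclosed block.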

@@ -79,7 +86,7 @@ Those properties will be added to the broker configuration, be careful some prop
 in case of conflict the `customBrokerProperties` values will take precedence. Please look at the source code to see what these properties
 are.

-## Utility methods
+### Utility methods

 The `EmbeddedKafka` trait also provides some utility methods to interact with the embedded Kafka, in order to set preconditions or verifications in your specs:

@@ -91,7 +98,7 @@ def consumeFirstMessageFrom(topic: String): String
 def createCustomTopic(topic: String, topicConfig: Map[String,String], partitions: Int, replicationFactor: Int): Unit
 ```

-## Custom producers
+### Custom producers

 It is possible to create producers for custom types in two ways:
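Stepping back to the utility methods whose signatures appear just above (`consumeFirstMessageFrom`, `createCustomTopic`), here is a rough sketch of how they fit inside a `withRunningKafka` block. Only those two methods are visible in this diff, so the publishing helper `publishStringMessageTo` used below is an assumption about the rest of the trait's API.

```scala
import net.manub.embeddedkafka.EmbeddedKafka
import org.scalatest.{Matchers, WordSpec}

class UtilityMethodsSpec extends WordSpec with Matchers with EmbeddedKafka {

  "the embedded broker" should {
    "round-trip a message on a custom topic" in {
      withRunningKafka {
        // Signature as shown in the README snippet above.
        createCustomTopic("custom-topic", Map.empty[String, String], partitions = 1, replicationFactor = 1)

        // Assumed helper for publishing a String message (not shown in this diff).
        publishStringMessageTo("custom-topic", "hello")

        consumeFirstMessageFrom("custom-topic") shouldBe "hello"
      }
    }
  }
}
```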

@@ -100,15 +107,15 @@ It is possible to create producers for custom types in two ways:

 For more information about how to use the utility methods, you can either look at the Scaladocs or at the tests of this project.

-## Custom consumers
+### Custom consumers

 Use the `Consumer` trait that easily creates consumers of arbitrary key-value types and manages their lifecycle (via a loaner pattern).
 * For basic String consumption use `Consumer.withStringConsumer { your code here }`.
 * For arbitrary key and value types, expose implicit `Deserializer`s for each type and use `Consumer.withConsumer { your code here }`.
 * If you just want to create a consumer and manage its lifecycle yourself then try `Consumer.newConsumer()`.

+### Easy message consumption

-## Easy message consumption
 With `ConsumerExtensions` you can turn a consumer into a lazy Scala `Stream` of key-value pairs and treat it as a collection for easy assertion.
 * Just import the extensions.
 * On any `KafkaConsumer` instance you can now do:
@@ -123,16 +130,16 @@ consumer.consumeLazily("from-this-topic").take(3).toList should be (Seq(
 )
 ```

-
-# scalatest-embedded-kafka-streams
+## scalatest-embedded-kafka-streams

 A library that builds on top of `scalatest-embedded-kafka` to offer easy testing of [Kafka Streams](https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Streams) with ScalaTest.
-It uses Kafka Streams 0.10.0.1.
+It uses Kafka Streams 0.10.1.1.
+
 It takes care of instantiating and starting your streams as well as closing them after running your test-case code.

-## How to use
+### How to use

-* In your `build.sbt` file add the following dependency: `"net.manub" %% "scalatest-embedded-kafka-streams" % "0.9.0" % "test"`
+* In your `build.sbt` file add the following dependency: `"net.manub" %% "scalatest-embedded-kafka-streams" % "0.11.0" % "test"`
 * Have a look at the [example test](kafka-streams/src/test/scala/net/manub/embeddedkafka/streams/ExampleKafkaStreamsSpec.scala)
 * For most of the cases have your `Spec` extend the `EmbeddedKafkaStreamsAllInOne` trait. This offers both streams management and easy creation of consumers for asserting resulting messages in output/sink topics.
 * If you only want to use the streams management without the test consumers just have the `Spec` extend the `EmbeddedKafkaStreams` trait.
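Putting the streams section together with the consumer extensions mentioned earlier, here is a sketch in the spirit of the linked `ExampleKafkaStreamsSpec`. The helper `runStreamsWithStringConsumer`, the `publishStringMessageTo` call, and the `ConsumerExtensions` import path are assumptions about the library's API rather than things shown in this diff; the topology uses the Kafka Streams 0.10.1.x `KStreamBuilder`.

```scala
import net.manub.embeddedkafka.ConsumerExtensions._ // import path assumed
import net.manub.embeddedkafka.streams.EmbeddedKafkaStreamsAllInOne
import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.streams.kstream.{KStream, KStreamBuilder}
import org.scalatest.{Matchers, WordSpec}

class PassThroughStreamsSpec extends WordSpec with Matchers with EmbeddedKafkaStreamsAllInOne {

  val inTopic = "streams-in"
  val outTopic = "streams-out"

  "a pass-through topology" should {
    "forward messages from the input topic to the output topic" in {
      // Minimal topology: read from inTopic and write straight to outTopic.
      val stringSerde = Serdes.String()
      val builder = new KStreamBuilder
      val stream: KStream[String, String] = builder.stream(stringSerde, stringSerde, inTopic)
      stream.to(stringSerde, stringSerde, outTopic)

      // Assumed helper (as used in the linked example test): starts the embedded
      // broker, runs the streams, and hands us a String consumer.
      runStreamsWithStringConsumer(Seq(inTopic, outTopic), builder) { consumer =>
        publishStringMessageTo(inTopic, "hello") // assumed publishing helper
        // consumeLazily turns the consumer into a lazy Stream of key-value pairs.
        consumer.consumeLazily(outTopic).take(1).toList.map(_._2) should be(List("hello"))
      }
    }
  }
}
```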

build.sbt

Lines changed: 9 additions & 6 deletions
@@ -2,15 +2,18 @@ import sbtrelease.Version

 parallelExecution in ThisBuild := false

-val kafkaVersion = "0.10.1.0"
+val kafkaVersion = "0.10.1.1"
+val akkaVersion = "2.4.14"

 val slf4jLog4jOrg = "org.slf4j"
 val slf4jLog4jArtifact = "slf4j-log4j12"

+resolvers in ThisBuild += "Apache Staging" at "https://repository.apache.org/content/groups/staging"
+
 lazy val commonSettings = Seq(
   organization := "net.manub",
   scalaVersion := "2.11.8",
-  crossScalaVersions := Seq("2.10.6", "2.11.8"),
+  crossScalaVersions := Seq("2.12.1", "2.11.8"),
   homepage := Some(url("https://github.com/manub/scalatest-embedded-kafka")),
   parallelExecution in Test := false,
   logBuffered in Test := false,
@@ -20,12 +23,12 @@ lazy val commonSettings = Seq(


 lazy val commonLibrarySettings = libraryDependencies ++= Seq(
-  "org.scalatest" %% "scalatest" % "3.0.0",
+  "org.scalatest" %% "scalatest" % "3.0.1",
   "org.apache.kafka" %% "kafka" % kafkaVersion exclude(slf4jLog4jOrg, slf4jLog4jArtifact),
-  "org.apache.zookeeper" % "zookeeper" % "3.4.7" exclude(slf4jLog4jOrg, slf4jLog4jArtifact),
+  "org.apache.zookeeper" % "zookeeper" % "3.4.8" exclude(slf4jLog4jOrg, slf4jLog4jArtifact),
   "org.apache.avro" % "avro" % "1.7.7" exclude(slf4jLog4jOrg, slf4jLog4jArtifact),
-  "com.typesafe.akka" %% "akka-actor" % "2.3.14" % Test,
-  "com.typesafe.akka" %% "akka-testkit" % "2.3.14" % Test
+  "com.typesafe.akka" %% "akka-actor" % akkaVersion % Test,
+  "com.typesafe.akka" %% "akka-testkit" % akkaVersion % Test
 )

 lazy val publishSettings = Seq(
