# scalatest-embedded-kafka
A library that provides an in-memory Kafka broker to run your ScalaTest specs against. It uses Kafka 0.10.1.1 and ZooKeeper 3.4.8.

The version supporting Kafka 0.8.x can be found [here](https://github.com/manub/scalatest-embedded-kafka/tree/kafka-0.8) - *this is no longer actively supported, although I'll be happy to accept PRs and produce releases.*
Inspired by https://github.com/chbatey/kafka-unit

### Version compatibility matrix
scalatest-embedded-kafka is available on Bintray and Maven Central, compiled for both Scala 2.11 and 2.12.

* Scala 2.10 is supported up to `0.10.0`
* Scala 2.11 is supported for all versions
* Scala 2.12 is supported from `0.11.0`. Please note that Kafka's support for Scala 2.12 is currently marked as pre-alpha by the Kafka team.

### How to use
* In your `build.sbt` file add the following dependency: `"net.manub" %% "scalatest-embedded-kafka" % "0.11.0" % "test"`
* Have your `Spec` extend the `EmbeddedKafka` trait.
* Enclose the code that needs a running instance of Kafka within the `withRunningKafka` closure.

```scala
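// A minimal illustrative sketch of a spec using the withRunningKafka closure
// (class, test names and body are illustrative, not a verbatim copy of the
// original example).
import net.manub.embeddedkafka.EmbeddedKafka
import org.scalatest.WordSpec

class MySpec extends WordSpec with EmbeddedKafka {

  "runs with embedded kafka" should {

    "work" in {
      withRunningKafka {
        // a Kafka broker and a ZooKeeper instance are running at this point;
        // publish to and consume from topics here
      }
    }
  }
}
```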
Please note that, in order to avoid Kafka instances not shutting down properly, it's recommended to call `EmbeddedKafka.stop()` in an `after` block or in similar teardown logic.
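A minimal sketch of such a teardown, using the `EmbeddedKafka.start()` / `EmbeddedKafka.stop()` style together with ScalaTest's `BeforeAndAfterAll` (the spec name and structure are illustrative):

```scala
import net.manub.embeddedkafka.EmbeddedKafka
import org.scalatest.{BeforeAndAfterAll, WordSpec}

class MyStartStopSpec extends WordSpec with BeforeAndAfterAll {

  override def beforeAll(): Unit = EmbeddedKafka.start()

  // make sure the broker and ZooKeeper are shut down even if a test fails
  override def afterAll(): Unit = EmbeddedKafka.stop()
}
```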
### Configuration
It's possible to change the ports on which Zookeeper and Kafka are started by providing an implicit `EmbeddedKafkaConfig`.
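For instance, a minimal sketch (the port numbers are illustrative):

```scala
import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
import org.scalatest.WordSpec

class MyPortsSpec extends WordSpec with EmbeddedKafka {

  "runs with embedded kafka on custom ports" should {

    "work" in {
      implicit val config = EmbeddedKafkaConfig(kafkaPort = 12345, zooKeeperPort = 54321)

      withRunningKafka {
        // the embedded broker now listens on port 12345
      }
    }
  }
}
```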
Those properties will be added to the broker configuration; be careful, as some properties are set by the library itself and, in case of conflict, the `customBrokerProperties` values will take precedence. Please look at the source code to see what these properties are.
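For example, a minimal sketch (the property name and value are illustrative, and `customBrokerProperties` is assumed to take a `Map[String, String]`):

```scala
implicit val customConfig = EmbeddedKafkaConfig(
  customBrokerProperties = Map("log.cleaner.dedupe.buffer.size" -> "2000000"))
```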
### Utility methods
The `EmbeddedKafka` trait also provides some utility methods to interact with the embedded Kafka, in order to set preconditions or verifications in your specs.
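For example, a minimal sketch using the String helpers (assuming a spec that mixes in `EmbeddedKafka` and ScalaTest's `Matchers`, with a broker running, e.g. inside `withRunningKafka`):

```scala
// seed a topic as a precondition
publishStringMessageToKafka("my-topic", "message")

// read the message back as a verification
consumeFirstStringMessageFrom("my-topic") shouldBe "message"
```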
It is possible to create producers for custom types in two ways (sketched below):
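A rough sketch of both approaches (the `aKafkaProducer` DSL below is an assumption based on this library's producer helpers; check the Scaladocs for the exact signatures):

```scala
import org.apache.kafka.common.serialization.StringSerializer

// inside a spec that mixes in EmbeddedKafka:

// 1. explicitly passing the Serializer class to use for values
val explicitProducer = aKafkaProducer thatSerializesValuesWith classOf[StringSerializer]

// 2. relying on an implicit Serializer for the value type being in scope
implicit val valueSerializer: StringSerializer = new StringSerializer
val implicitProducer = aKafkaProducer[String]
```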
For more information about how to use the utility methods, you can either look at the Scaladocs or at the tests of this project.
### Custom consumers
Use the `Consumer` trait that easily creates consumers of arbitrary key-value types and manages their lifecycle (via a loaner pattern).

* For basic String consumption use `Consumer.withStringConsumer { your code here }` (a sketch follows after this list).
* For arbitrary key and value types, expose implicit `Deserializer`s for each type and use `Consumer.withConsumer { your code here }`.
* If you just want to create a consumer and manage its lifecycle yourself, try `Consumer.newConsumer()`.
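
A minimal sketch of the loaner pattern with the String consumer (whether `withStringConsumer` is called on a mixed-in trait or on a `Consumer` object, and its exact signature, are assumptions; check the Scaladocs):

```scala
// inside a spec that mixes in the Consumer trait, with an embedded broker running
withStringConsumer { consumer =>
  // the KafkaConsumer[String, String] is created for you, handed to this block
  // and closed once the block completes
  consumer.subscribe(java.util.Collections.singletonList("my-topic"))
  consumer.poll(1000)
}
```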
### Easy message consumption
With `ConsumerExtensions` you can turn a consumer into a Scala lazy `Stream` of key-value pairs and treat it as a collection for easy assertion.

* Just import the extensions.
* On any `KafkaConsumer` instance you can now do:
```scala
consumer.consumeLazily("from-this-topic").take(3).toList should be (Seq(
  // ... the expected (key, value) pairs ...
))
```
## scalatest-embedded-kafka-streams
A library that builds on top of `scalatest-embedded-kafka` to offer easy testing of [Kafka Streams](https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Streams) with ScalaTest.
It uses Kafka Streams 0.10.1.1.

It takes care of instantiating and starting your streams as well as closing them after running your test-case code.
### How to use
* In your `build.sbt` file add the following dependency: `"net.manub" %% "scalatest-embedded-kafka-streams" % "0.11.0" % "test"`
* Have a look at the [example test](kafka-streams/src/test/scala/net/manub/embeddedkafka/streams/ExampleKafkaStreamsSpec.scala)
* For most cases, have your `Spec` extend the `EmbeddedKafkaStreamsAllInOne` trait. This offers both streams management and easy creation of consumers for asserting resulting messages in output/sink topics (a sketch follows after this list).
* If you only want to use the streams management without the test consumers, just have your `Spec` extend the `EmbeddedKafkaStreams` trait.
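
A rough sketch of what such a test can look like (the `runStreams` method name and its parameters are assumptions based on the traits above and the linked example test; refer to that spec for the exact API):

```scala
import net.manub.embeddedkafka.streams.EmbeddedKafkaStreams
import org.apache.kafka.streams.kstream.KStreamBuilder
import org.scalatest.WordSpec

class MyStreamsSpec extends WordSpec with EmbeddedKafkaStreams {

  "a Kafka Streams topology" should {
    "run against the embedded broker" in {
      // a trivial topology: copy everything from the input topic to the output topic
      val builder = new KStreamBuilder
      builder.stream[String, String]("input-topic").to("output-topic")

      // creates the topics, starts the streams, runs the block and closes the streams
      runStreams(Seq("input-topic", "output-topic"), builder) {
        // publish to "input-topic" and assert on "output-topic" here,
        // e.g. with the EmbeddedKafka utility methods or a test consumer
      }
    }
  }
}
```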