# Kafka Python client

This module provides low-level protocol support for Apache Kafka as well as
high-level consumer and producer classes. The protocol supports request
batching, and the client performs broker-aware request routing. Gzip and
Snappy compression are also supported for message sets.
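
Compression here applies to the message set as a whole rather than to each
message individually. As a rough illustration of the idea (this is *not* the
Kafka wire format, which also carries magic bytes, attribute flags, offsets,
and CRC checksums), a gzip'd message set can be sketched with the standard
library alone:

```python
import gzip
import io
import struct

def gzip_encode_message_set(messages):
    """Length-prefix each message, concatenate, and gzip the whole batch.

    Illustrative sketch only -- the real Kafka message-set format is more
    involved than a bare length-prefixed concatenation.
    """
    buf = io.BytesIO()
    for msg in messages:
        buf.write(struct.pack(">i", len(msg)))  # 4-byte big-endian length
        buf.write(msg)
    return gzip.compress(buf.getvalue())

def gzip_decode_message_set(payload):
    """Inverse of the sketch above: gunzip, then split on length prefixes."""
    data = gzip.decompress(payload)
    messages, pos = [], 0
    while pos < len(data):
        (size,) = struct.unpack_from(">i", data, pos)
        pos += 4
        messages.append(data[pos:pos + size])
        pos += size
    return messages
```

Snappy works the same way conceptually, with `snappy.compress` in place of
gzip (and requires the optional python-snappy package).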

Compatible with Apache Kafka 0.8.0

http://kafka.apache.org/

# License

Copyright 2013, David Arthur under Apache License, v2.0. See `LICENSE`

# Status

I'm following the version numbers of Kafka, plus one number to indicate the
version of this project. The current version is 0.8.0-1. This version is under
development, and APIs are subject to change.

# Usage

## High level

```python
from kafka.client import KafkaClient
from kafka.consumer import SimpleConsumer
from kafka.producer import SimpleProducer

kafka = KafkaClient("localhost", 9092)

producer = SimpleProducer(kafka, "my-topic")
producer.send_messages("some message")
producer.send_messages("this method", "is variadic")

consumer = SimpleConsumer(kafka, "my-group", "my-topic")
for message in consumer:
    print(message)

kafka.close()
```
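
The second `send_messages` call above works because the method is variadic:
each positional argument becomes its own Kafka message. A stand-in sketch of
that calling convention (this is not kafka-python's actual implementation;
`topic` is made explicit here for illustration):

```python
# Stand-in sketch of a variadic send: each positional argument after
# the topic becomes one message destined for that topic.
def send_messages(topic, *messages):
    return [(topic, msg) for msg in messages]
```

So `send_messages("my-topic", "this method", "is variadic")` produces two
separate messages, not one concatenated one.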

## Low level

```python
from kafka.client import KafkaClient
kafka = KafkaClient("localhost", 9092)
req = ProduceRequest(topic="my-topic", partition=1,
    messages=[KafkaProtocol.encode_message("some message")])
resps = kafka.send_produce_request(payloads=[req], fail_on_error=True)
kafka.close()

resps[0].topic      # "my-topic"
resps[0].partition  # 1
resps[0].error      # 0 (hopefully)
resps[0].offset     # offset of the first message sent in this request
```
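
When not failing fast on errors, the caller inspects the `error` field of each
response itself. A hypothetical helper showing that pattern; `ProduceResponse`
here is a stand-in namedtuple mirroring the fields listed above, not the
library's own class:

```python
from collections import namedtuple

# Stand-in for the response tuples returned by send_produce_request;
# mirrors the fields shown above (topic, partition, error, offset).
ProduceResponse = namedtuple("ProduceResponse",
                             ["topic", "partition", "error", "offset"])

def check_responses(resps):
    """Hypothetical helper: raise if any response carries a nonzero
    Kafka error code, otherwise return the assigned offsets."""
    for resp in resps:
        if resp.error != 0:
            raise RuntimeError("produce to %s:%d failed with error code %d"
                               % (resp.topic, resp.partition, resp.error))
    return [resp.offset for resp in resps]
```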
# Install

```shell
pip install python-snappy
```

# Tests

Some of the tests will fail if Snappy is not installed. These tests will throw
NotImplementedError. If you see other failures, they might be bugs - so please
report them!

## Run the unit tests

_These are broken at the moment_

```shell
python -m test.unit
```

## Run the integration tests

```shell
cd kafka-src
./sbt package
```

Next start up a ZooKeeper server on localhost:2181

```shell
/opt/zookeeper/bin/zkServer.sh start
```

This will actually start up real Kafka brokers and send messages in via the
client.