Update RTD links with https / .io #1074

Merged 1 commit on Apr 10, 2017
12 changes: 6 additions & 6 deletions README.rst
@@ -2,7 +2,7 @@ Kafka Python client
------------------------

.. image:: https://img.shields.io/badge/kafka-0.10%2C%200.9%2C%200.8.2%2C%200.8.1%2C%200.8-brightgreen.svg
-    :target: https://kafka-python.readthedocs.org/compatibility.html
+    :target: https://kafka-python.readthedocs.io/compatibility.html
.. image:: https://img.shields.io/pypi/pyversions/kafka-python.svg
:target: https://pypi.python.org/pypi/kafka-python
.. image:: https://coveralls.io/repos/dpkp/kafka-python/badge.svg?branch=master&service=github
@@ -26,7 +26,7 @@ check code (perhaps using zookeeper or consul). For older brokers, you can
achieve something similar by manually assigning different partitions to each
consumer instance with config management tools like chef, ansible, etc. This
approach will work fine, though it does not support rebalancing on failures.
-See <http://kafka-python.readthedocs.org/en/master/compatibility.html>
+See <https://kafka-python.readthedocs.io/en/master/compatibility.html>
for more details.

Please note that the master branch may contain unreleased features. For release
@@ -41,7 +41,7 @@ KafkaConsumer is a high-level message consumer, intended to operate as similarly
as possible to the official java client. Full support for coordinated
consumer groups requires use of kafka brokers that support the Group APIs: kafka v0.9+.

-See <http://kafka-python.readthedocs.org/en/master/apidoc/KafkaConsumer.html>
+See <https://kafka-python.readthedocs.io/en/master/apidoc/KafkaConsumer.html>
for API and configuration details.

The consumer iterator returns ConsumerRecords, which are simple namedtuples
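The context above mentions that the consumer iterator yields ConsumerRecords, which are plain namedtuples. A minimal stand-in sketch (the field list here is an illustrative subset, not the library's full definition) shows what that access pattern looks like:

```python
from collections import namedtuple

# Simplified stand-in for kafka-python's ConsumerRecord namedtuple;
# the real record carries additional fields (e.g. timestamp, checksum).
ConsumerRecord = namedtuple(
    "ConsumerRecord", ["topic", "partition", "offset", "key", "value"]
)

# Field access works like any namedtuple returned by the consumer iterator.
record = ConsumerRecord(
    topic="my-topic", partition=0, offset=42, key=None, value=b"hello"
)
print(record.topic, record.partition, record.offset, record.value)
```

Because records are namedtuples, they also unpack positionally, e.g. `topic, partition, offset, key, value = record`.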
@@ -76,7 +76,7 @@ KafkaProducer

KafkaProducer is a high-level, asynchronous message producer. The class is
intended to operate as similarly as possible to the official java client.
-See <http://kafka-python.readthedocs.org/en/master/apidoc/KafkaProducer.html>
+See <https://kafka-python.readthedocs.io/en/master/apidoc/KafkaProducer.html>
for more details.

>>> from kafka import KafkaProducer
@@ -116,7 +116,7 @@ Compression
kafka-python supports gzip compression/decompression natively. To produce or consume lz4
compressed messages, you should install python-lz4 (pip install lz4).
To enable snappy compression/decompression install python-snappy (also requires snappy library).
-See <http://kafka-python.readthedocs.org/en/master/install.html#optional-snappy-install>
+See <https://kafka-python.readthedocs.io/en/master/install.html#optional-snappy-install>
for more information.
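The hunk above notes that gzip compression is supported natively. As a quick illustration of what message compression buys (this uses the stdlib `gzip` module directly, not kafka-python's internal codec path), a repetitive payload shrinks substantially and round-trips losslessly:

```python
import gzip

# A repetitive payload, typical of log-style Kafka messages.
payload = b"kafka message " * 100

# Compress, then verify a lossless round-trip.
compressed = gzip.compress(payload)
assert len(compressed) < len(payload)
assert gzip.decompress(compressed) == payload
```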

Protocol
@@ -135,4 +135,4 @@ Low-level

Legacy support is maintained for low-level consumer and producer classes,
SimpleConsumer and SimpleProducer. See
-<http://kafka-python.readthedocs.io/en/master/simple.html?highlight=SimpleProducer> for API details.
+<https://kafka-python.readthedocs.io/en/master/simple.html?highlight=SimpleProducer> for API details.