Commit d1d99f9

(DOCSP-19377) Section Title Audit (#195)

biniona-mongodb authored and schmalliso committed
1 parent 2881b5a

File tree: 10 files changed (+36, -33 lines)

source/introduction/connect.txt

Lines changed: 2 additions & 2 deletions
@@ -48,8 +48,8 @@ To learn more about this configuration option, see the following resources:
 - :doc:`Source connector configuration options </source-connector/configuration-properties/mongodb-connection/>`
 - :doc:`Sink connector configuration options </sink-connector/configuration-properties/mongodb-connection/>`

-Connection URI Options
-----------------------
+How to Configure Your Connection
+--------------------------------

 The {+mkc+} uses the **MongoDB Java driver** to parse your connection URI.
 The MongoDB Java driver is an artifact that enables Java applications like {+kc+}
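The hunk above renames the section that covers connection configuration. For context, a minimal connection setting for the connector might look like the following sketch; the host, port, and credentials are placeholders, not values from this commit:

```properties
# Sketch of a MongoDB Kafka Connector connection setting.
# The URI below is a placeholder; the MongoDB Java driver parses it.
connection.uri=mongodb://username:password@localhost:27017
```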

source/introduction/install.txt

Lines changed: 7 additions & 7 deletions
@@ -49,7 +49,7 @@ connector on Confluent Platform:
    <https://docs.confluent.io/home/connect/community.html>`__.

 2. Use the connector GitHub URL and uber JAR locations in the
-   :ref:`installation reference table <kafka-connector-installation-reference>`
+   :ref:`reference table <kafka-connector-installation-reference>`
    when appropriate in the Confluent manual installation instructions.

 .. _kafka-connector-install-apache:
@@ -58,8 +58,8 @@ Install the Connector on Apache Kafka
 -------------------------------------

 1. Locate and download the uber JAR to get all the dependencies required
-   for the connector. Refer to the
-   :ref:`installation reference table <kafka-connector-installation-reference>`
+   for the connector. Check the
+   :ref:`reference table <kafka-connector-installation-reference>`
    to find the uber JAR.

 .. note::
@@ -81,13 +81,13 @@ Install the Connector on Apache Kafka
 If you intend to run the connector as distributed worker processes, you
 must repeat this process for each server or virtual machine.

-.. _kafka-connector-installation-reference:
-
-Installation Reference Table
-----------------------------
+Download a Connector JAR File
+-----------------------------

 You can download the {+mkc+} source and JAR files from the following locations:

+.. _kafka-connector-installation-reference:
+
 .. list-table::
    :widths: 55 45
    :stub-columns: 1

source/quick-start.txt

Lines changed: 2 additions & 2 deletions
@@ -34,8 +34,8 @@ Read the following sections to set up your sandbox and sample data pipeline.
    following the instructions in the :ref:`<kafka-quickstart-remove-the-sandbox>`
    section.

-Requirements
-------------
+Install the Required Packages
+-----------------------------

 .. include:: /includes/tutorials/pipeline-requirements.rst
4141

source/sink-connector/configuration-properties/error-handling.txt

Lines changed: 1 addition & 0 deletions
@@ -142,3 +142,4 @@ the dead letter queue messages should include context headers.
    errors.deadletterqueue.topic.name=example.deadletterqueue
    errors.deadletterqueue.context.headers.enable=true

+To learn more about dead letter queues, see :ref:`<kafka-sink-errors-dlq>`.
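The two dead letter queue properties in the hunk above usually appear alongside an error-tolerance setting. A hedged sketch of the combined block (topic name is a placeholder; `errors.tolerance` is the standard Kafka Connect property, assumed here rather than taken from this commit):

```properties
# Sketch of a sink connector error-handling block.
# errors.tolerance=all keeps the connector running and routes
# failed records to the dead letter queue topic named below.
errors.tolerance=all
errors.deadletterqueue.topic.name=example.deadletterqueue
errors.deadletterqueue.context.headers.enable=true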

source/sink-connector/configuration-properties/time-series.txt

Lines changed: 2 additions & 2 deletions
@@ -76,9 +76,9 @@ Settings
   a number. If the value is a string, the connector uses the
   setting in the following configuration to parse the date:

-  .. code-block:: none
+     .. code-block:: none

-     timeseries.timefield.auto.convert.date.format
+        timeseries.timefield.auto.convert.date.format

 | If the connector fails to convert the value, it sends the
   original value to the time series collection.
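The property in the hunk above belongs to the connector's `timeseries.*` namespace. As an illustrative sketch (field name and date pattern are placeholders; verify property names against the time series configuration reference):

```properties
# Hypothetical time series sink settings.
timeseries.timefield=ts
timeseries.timefield.auto.convert=true
# Pattern used to parse string values in the time field.
timeseries.timefield.auto.convert.date.format=yyyy-MM-dd'T'HH:mm:ss'Z'
```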

source/sink-connector/fundamentals/error-handling-strategies.txt

Lines changed: 14 additions & 12 deletions
@@ -38,14 +38,14 @@ Handle Errors
 When your connector encounters an error, it needs to handle it in some way.
 Your sink connector can do the following in response to an error:

-- :ref:`Stop <kafka-sink-errors-stop>` *default*
-- :ref:`Tolerate Errors <kafka-sink-tolerate-errors>`
-- :ref:`Write Errant Messages to a Topic (Dead Letter Queue) <kafka-sink-errors-dlq>`
+- :ref:`<kafka-sink-errors-stop>` *default*
+- :ref:`<kafka-sink-tolerate-errors>`
+- :ref:`<kafka-sink-errors-dlq>`

 .. _kafka-sink-errors-stop:

-Stop
-~~~~
+Stop For All Errors
+~~~~~~~~~~~~~~~~~~~

 By default, your sink connector terminates and stops processing messages
 when it encounters an error. This is a good option for you if any error in
@@ -68,8 +68,8 @@ adding the following to your connector configuration:

 .. _kafka-sink-tolerate-errors:

-Tolerate Errors
-~~~~~~~~~~~~~~~
+Tolerate All Errors
+~~~~~~~~~~~~~~~~~~~

 You can configure your sink connector to tolerate all errors and never stop
 processing messages. This is a good option for getting your sink connector up and
@@ -85,10 +85,10 @@ option:

 .. _kafka-sink-errors-dlq:

-Dead Letter Queue
-~~~~~~~~~~~~~~~~~
+Write Errors and Errant Messages to a Topic
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-You can configure your sink connector to write errant messages to a
+You can configure your sink connector to write errors and errant messages to a
 topic, called a **dead letter queue**, for you to inspect or process further.
 A dead letter queue is a location in message queueing
 systems such as {+ak+} where the system routes errant messages instead of
@@ -131,6 +131,8 @@ errant message, use the following option:
 For more information, see Confluent's guide on
 `Dead Letter Queues <https://docs.confluent.io/cloud/current/connectors/dead-letter-queue.html#dead-letter-queue>`__.

+To view another dead letter queue configuration example, see :ref:`<sink-dead-letter-queue-configuration-example>`.
+
 .. _kafka-sink-log-errors:

 Log Errors
@@ -172,8 +174,8 @@ For more information, see Confluent's guide on

 .. _kakfa-sink-connector-level:

-Connector Level Options
------------------------
+Handle Errors at the Connector Level
+------------------------------------

 The {+mkc+} provides options that allow you to configure error
 handling at the connector level. The options are as follows:
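The connector-level options referenced at the end of the hunk above are truncated in this diff view. As a hedged illustration only, the connector's `mongo.`-prefixed overrides typically look like the following (names assumed from the connector's error-handling options, not shown in this commit):

```properties
# Sketch of connector-level error-handling overrides, which take
# precedence over the framework-level errors.* settings.
mongo.errors.tolerance=all
mongo.errors.log.enable=true
```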

source/sink-connector/fundamentals/post-processors.txt

Lines changed: 2 additions & 2 deletions
@@ -846,8 +846,8 @@ it outputs the following:

 .. _sink-post-processors-custom:

-Create A Custom Post Processor
-------------------------------
+How to Create a Custom Post Processor
+-------------------------------------

 If the built-in post processors do not cover your use case, you can create
 a custom post processor class using the following steps:

source/sink-connector/fundamentals/write-strategies.txt

Lines changed: 2 additions & 2 deletions
@@ -285,8 +285,8 @@ your collection.

 .. _kafka-sink-write-model-create-custom-strategy:

-Create A Custom Write Model Strategy
-------------------------------------
+Custom Write Model Strategies
+-----------------------------

 If none of the pre-built write model strategies fit your use case, you can create
 your own.

source/source-connector/usage-examples/copy-existing-data.txt

Lines changed: 2 additions & 2 deletions
@@ -96,8 +96,8 @@ To learn more about aggregation pipelines, see the following resources:
 - :manual:`Aggregation </aggregation>` in the MongoDB manual.


-Copy Data Configuration
-~~~~~~~~~~~~~~~~~~~~~~~
+Specify the Configuration
+~~~~~~~~~~~~~~~~~~~~~~~~~

 Your source connector configuration to copy the ``customers`` collection should
 look like this:
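The configuration itself is elided from this diff view. As a hedged sketch of what a copy-existing source connector block typically contains (database, collection, and URI are placeholders; property names assumed from the connector's copy-existing options, not from this commit):

```properties
# Hypothetical source connector settings that copy existing data
# before streaming change events.
connection.uri=mongodb://localhost:27017
database=example
collection=customers
copy.existing=true
```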

source/source-connector/usage-examples/schema.txt

Lines changed: 2 additions & 2 deletions
@@ -129,8 +129,8 @@ For more information on the ``fullDocument`` field, see the

 .. _usage-example-schema-config:

-Custom Schema Configuration
-~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Specify the Configuration
+~~~~~~~~~~~~~~~~~~~~~~~~~

 Your custom schema connector configuration should resemble the following:
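The custom schema configuration is elided from this diff view. As an illustrative sketch only (the Avro schema string and field names are placeholders; `output.format.value` and `output.schema.value` are assumed from the connector's output format options, not shown in this commit):

```properties
# Hypothetical custom schema settings for a source connector.
output.format.value=schema
output.schema.value={"type":"record","name":"customer","fields":[{"name":"name","type":"string"}]}
```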
