
Commit 4b1c53e

biniona-mongodb authored and schmalliso committed
(DOCSP-19182) Replace "Link Only" TODOs + Remove "Remove TODO" TODOs (#186)
1 parent 693e67b commit 4b1c53e

18 files changed: +70 additions, −433 deletions

source/includes/steps-cdc-tutorial.yaml

Lines changed: 24 additions & 11 deletions
@@ -26,8 +26,9 @@ content: |
 connector writes change event documents corresponding to the ``Source``
 collection in the ``CDCTutorial`` database to a Kafka topic. The configuration
 for the source connector is as follows:
-<TODO: Link to Source - Fundamentals - Change Streams page>
-
+
+.. _cdc-tutorial-source-connector:
+
 .. code-block:: properties
    :copyable: false
 
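The connector's full properties file sits outside this hunk's context. As a minimal sketch only, assuming illustrative ``name`` and ``connection.uri`` values rather than the tutorial's exact file, the source connector configuration that the new ``cdc-tutorial-source-connector`` label points to might look like:

.. code-block:: properties
   :copyable: false

   # Sketch of a CDC source connector; name and URI are assumed values
   name=mongo-cdc-source
   connector.class=com.mongodb.kafka.connect.MongoSourceConnector
   connection.uri=mongodb://mongo1:27017/?replicaSet=rs0
   database=CDCTutorial
   collection=Source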
@@ -105,6 +106,8 @@ content: |
 Your upper terminal window is now listening to the ``CDCTutorial.Source`` Kafka
 topic. Changes to your topic will print in this terminal window.
 
+To learn more about ``kafkacat``, see the :github:`kcat repository <edenhill/kcat>` on GitHub.
+
 ---
 title: Configure Sink Connector
 ref: cdc-tutorial-configure-sink
@@ -174,6 +177,8 @@ content: |
 
    rs0 [primary] test>
 
+.. _cdc-tutorial-change-data-insert:
+
 Insert a document into the ``Source`` collection of the ``CDCTutorial`` database
 with the following commands:
 
@@ -250,17 +255,25 @@ content: |
 
    exit
 
-Try and explore the CDC handler on your own. Here are some challenges to get
+Explore the sample pipeline on your own. Here are some challenges to get
 you started:
 
-- Add a second source connector that writes to the ``CDCTutorial.Source``
-  topic. Use a pipeline to have this connector only write insert events.
-- Remove the ``change.data.capture.handler`` from your sink connector. What
-  do your documents look like?
-- Use ``kafkacat`` to upload a message to the ``CDCTutorial.Source`` topic
-  that isn't a MongoDB Change Event document. What happens?
-
-<TODO: Link to relevant docs sections from the explore list above >
+- Add a new source connector that writes to the ``CDCTutorial.Source``
+  topic. Configure your new connector to write only insert events. To
+  learn how to filter event types in your connector, see the
+  :ref:`<source-usage-example-custom-pipeline>` guide.
+- Add a new source connector configured with the ``publish.full.document.only=true``
+  option that writes to the ``CDCTutorial.Source`` topic. Publish a document with
+  your new source connector. This produces an error in your sink connector, and your
+  sink connector stops. Configure your sink connector to write errant messages
+  to a topic rather than stop. To learn how to write errant messages to a topic,
+  see :ref:`<kafka-sink-errors-dlq>`.
+- Remove the ``change.data.capture.handler`` from your sink connector.
+  Add the :ref:`source connector from the tutorial <cdc-tutorial-source-connector>` to {+kc+}
+  if it's not already added. Insert a document into MongoDB
+  :ref:`as done in the tutorial <cdc-tutorial-change-data-insert>`.
+  Look at the :ref:`change event document <source-connector-fundamentals-change-event>`
+  your sink connector inserts into MongoDB.
 
 ---
 title: Stop the Pipeline
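The first two challenges above map onto connector properties covered by the linked guides. As a minimal sketch, assuming the two groups of settings land in separate source and sink connector configurations and that the dead letter queue topic name is illustrative:

.. code-block:: properties
   :copyable: false

   # Source connector (challenge 1): publish only insert events
   pipeline=[{"$match": {"operationType": "insert"}}]

   # Sink connector (challenge 2): route errant messages to a dead letter
   # queue topic instead of stopping; the topic name is an assumption
   errors.tolerance=all
   errors.deadletterqueue.topic.name=CDCTutorial.DLQ
   errors.deadletterqueue.context.headers.enable=true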

source/introduction/connect.txt

Lines changed: 2 additions & 0 deletions
@@ -1,3 +1,5 @@
+.. _kafka-intro-connect:
+
 ==================
 Connect to MongoDB
 ==================

source/introduction/converters.txt

Lines changed: 2 additions & 10 deletions
@@ -22,8 +22,6 @@ Converters are programs that translate between bytes and
 Converters pass data between {+kc+} and Apache Kafka. The {+mkc+} passes data
 between MongoDB and {+kc+}. The following diagram shows these relationships:
 
-<TODO: Design Look At Diagram>
-
 .. figure:: /includes/figures/converters.png
    :alt: Diagram illustrating converters' role in Kafka Connect
 
@@ -45,8 +43,6 @@ format, the {+mkc+} works with all available converters.
 For example, if your source connector writes to a topic using Protobuf, your
 sink connector must use Protobuf to read from the topic.
 
-< TODO: Link "topic" to our glossary page entry on "topics" >
-
 To learn what converter to use, `see this page from Confluent <https://docs.confluent.io/platform/current/schema-registry/connect.html>`__.
 
 Converters with Schemas
@@ -55,16 +51,12 @@ Converters with Schemas
 If you use a schema-based converter such as the converter for Avro, Protobuf, or
 JSON Schema, you should define a schema in your {+mkc+} source connector.
 
-To learn more, see our
-:doc:`guide on applying a schema in a {+mkc+} source connector </source-connector/fundamentals>`.
-
-.. TODO: Link to correct page once available
+To learn how to specify a schema, see the
+:ref:`<kafka-source-apply-schemas>` guide.
 
 Connector Configuration
 -----------------------
 
-<TODO: Move this content into a reference Section as per https://jira.mongodb.org/browse/DOCSP-18435>
-
 This section provides templates for properties files to configure the following
 converters in a {+mkc+} pipeline:
 
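As one hedged example of the template shape this section promises, a JSON converter declaration in a connector properties file might look like the following; whether to embed schemas in messages depends on your pipeline, so treat the ``schemas.enable`` values as assumptions:

.. code-block:: properties
   :copyable: false

   # Sketch: JSON converter pair for record keys and values
   key.converter=org.apache.kafka.connect.json.JsonConverter
   value.converter=org.apache.kafka.connect.json.JsonConverter
   key.converter.schemas.enable=true
   value.converter.schemas.enable=true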

source/introduction/data-formats.txt

Lines changed: 13 additions & 7 deletions
@@ -59,8 +59,9 @@ Raw JSON is a data format that consists of JSON objects written as strings. You
 
    "{\"company\":\"MongoDB\"}"
 
-You use Raw JSON when you specify a String Converter on a
-source or sink connector. <TODO: Link to Converters page when it is ready>
+You use Raw JSON when you specify a String converter on a
+source or sink connector. To view connector configurations that specify a
+String converter, see the :ref:`Converters <string-converter-sample-properties>` guide.
 
 .. _kafka-df-bson:
 
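For orientation, a hedged sketch of the String converter declaration this passage refers to, using Kafka Connect's stock ``StringConverter`` class:

.. code-block:: properties
   :copyable: false

   # Pass record keys and values through as Raw JSON strings
   key.converter=org.apache.kafka.connect.storage.StringConverter
   value.converter=org.apache.kafka.connect.storage.StringConverter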
@@ -115,7 +116,10 @@ with JSON Schema like this:
    }
 
 You use JSON Schema when you apply JSON Schema converters to your connectors.
-<TODO: Link to this section of converters page>
+To view connector configurations that specify a
+JSON Schema converter, see the :ref:`Converters <json-schema-converter-sample-properties>`
+guide.
+
 
 For more information, see the official
 `JSON Schema website <https://json-schema.org/>`__.
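A hedged sketch of the corresponding declaration, assuming Confluent's JSON Schema converter and an illustrative Schema Registry address:

.. code-block:: properties
   :copyable: false

   # JSON Schema converter; the registry URL is an assumed value
   value.converter=io.confluent.connect.json.JsonSchemaConverter
   value.converter.schema.registry.url=http://localhost:8081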
@@ -200,8 +204,10 @@ like this:
 
    \x0eMongoDB
 
-You use Avro binary encoding when you specify an Avro Converter on a
-source or sink connector. <TODO: Link to Converters page when its ready>.
+You use Avro binary encoding when you specify an Avro converter on a
+source or sink connector. To view connector configurations that specify an
+Avro converter, see the :ref:`Converters <avro-converter-sample-properties>`
+guide.
 
 To learn more about Avro binary encoding, see
 `this section of the Avro specification <https://avro.apache.org/docs/current/spec.html#Data+Serialization+and+Deserialization>`__.
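Likewise, a hedged sketch of an Avro converter declaration, again assuming an illustrative Schema Registry address:

.. code-block:: properties
   :copyable: false

   # Avro converter; the registry URL is an assumed value
   value.converter=io.confluent.connect.avro.AvroConverter
   value.converter.schema.registry.url=http://localhost:8081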
@@ -217,5 +223,5 @@ You can represent the sample document as a byte array using any of the encodings
 mentioned above.
 
 You use byte arrays when your converters send data to or receive data
-from {+ak+}. For more information on converters, see our guide on converters.
-<TODO: link to converters page>
+from {+ak+}. For more information on converters, see the
+:ref:`<intro-converters>` guide.

source/introduction/data-formats/avro-schema.txt

Lines changed: 0 additions & 122 deletions
This file was deleted.
