Commit 82bee58

DOCSP-19649: Audited Kafka connector names. (#91)
1 parent faf5456 commit 82bee58


43 files changed (+84, -84 lines changed)

source/contribute.txt

Lines changed: 2 additions & 2 deletions

@@ -39,7 +39,7 @@ Gradle checks. You can run the checks with the following command:
 .. note:: Skipped Tests

    You can skip tests in the ``integrationTest`` task related to
-   the following areas unless your code specifically modifies {+connector+} behavior
+   the following areas unless your code specifically modifies connector behavior
    related to these areas:

    - Specific versions of MongoDB
@@ -54,7 +54,7 @@ Gradle checks. You can run the checks with the following command:
 You can run tests related to a specific MongoDB version by deploying a local replica set
 with that version of MongoDB.

-To learn more about the {+connector+} source code, see the :github:`GitHub repository <mongodb/mongo-kafka>`.
+To learn more about the connector source code, see the :github:`GitHub repository <mongodb/mongo-kafka>`.

 To learn more about Gradle, see the official
 `Gradle website <https://docs.gradle.org/>`__.
Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-A {+source-connector+} works by opening a single change stream with
+The source connector works by opening a single change stream with
 MongoDB and sending data from that change stream to {+kafka-connect+}. Your source
 connector maintains its change stream for the duration of its runtime, and your
 connector closes its change stream when you stop it.

source/introduction/connect.txt

Lines changed: 2 additions & 2 deletions

@@ -60,7 +60,7 @@ to interact with MongoDB.
 Version {+connector_version+} of the {+connector+} uses version
 {+connector_driver_version+} of the MongoDB Java driver.

-To learn what connection URI options are available in the {+connector+}, see
+To learn what connection URI options are available in the connector, see
 `the MongoDB Java driver Connection guide <{+connector_driver_url_base+}fundamentals/connection/#connection-options>`__.

 Authentication
@@ -86,5 +86,5 @@ The following is an example of a connection URI that authenticates with MongoDB
 To learn what authentication mechanisms are available, see
 `the MongoDB Java driver Authentication Mechanisms guide <{+connector_driver_url_base+}fundamentals/auth/#mechanisms>`__.

-To learn more about authentication in the {+connector+}, see the
+To learn more about authentication in the connector, see the
 :doc:`Security and Authentication guide </security-and-authentication>`.

source/introduction/converters.txt

Lines changed: 7 additions & 7 deletions

@@ -19,7 +19,7 @@ This guide describes how to use **converters** with the {+connector+}.
 Converters are programs that translate between bytes and
 {+kafka-connect+}'s runtime data format.

-Converters pass data between {+kafka-connect+} and Apache Kafka. The {+connector+} passes data
+Converters pass data between {+kafka-connect+} and Apache Kafka. The connector passes data
 between MongoDB and {+kafka-connect+}. The following diagram shows these relationships:

 .. figure:: /includes/figures/converters.png
@@ -34,12 +34,12 @@ To learn more about converters, see the following resources:
 Available Converters
 --------------------

-As the {+connector+} converts your MongoDB data into {+kafka-connect+}'s runtime data
-format, the {+connector+} works with all available converters.
+As the connector converts your MongoDB data into {+kafka-connect+}'s runtime data
+format, the connector works with all available converters.

 .. important:: Use the Same Converter for your Source and Sink Connectors

-   You must use the same converter in your source and sink connectors.
+   You must use the same converter in your {+source-connector+} and {+sink-connector+}.
    For example, if your source connector writes to a topic using Protobuf, your
    sink connector must use Protobuf to read from the topic.

@@ -49,7 +49,7 @@ Converters with Schemas
 ~~~~~~~~~~~~~~~~~~~~~~~

 If you use a schema-based converter such as the converter for Avro, Protobuf, or
-JSON Schema, you should define a schema in your {+connector+} source connector.
+JSON Schema, you should define a schema in your source connector.

 To learn how to specify a schema, see the
 :ref:`<kafka-source-apply-schemas>` guide.
@@ -58,7 +58,7 @@ Connector Configuration
 -----------------------

 This section provides templates for properties files to configure the following
-converters in a {+connector+} pipeline:
+converters in a connector pipeline:

 - :ref:`Avro Converter <avro-converter-sample-properties>`
 - :ref:`Protobuf Converter <protobuf-converter-sample-properties>`
@@ -274,7 +274,7 @@ Click the following tabs to view properties files that work with the String conv
 .. important:: Received Strings Must be Valid JSON

-   Your {+connector+} sink connector must receive valid JSON strings from your
+   Your sink connector must receive valid JSON strings from your
    {+kafka+} topic even when using a String converter.

 To use the preceding properties file, replace the placeholder text in angle
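The properties-file templates this page refers to are hidden in this diff view. As a rough sketch of what pairing a converter with both connectors looks like (using the stock Apache Kafka JSON converter class, which is not taken from this commit):

```properties
# Apply the same converter on both the source and sink side, as the
# "Use the Same Converter" admonition above requires.
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Set to false if your messages do not embed a schema.
value.converter.schemas.enable=true
```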

source/introduction/data-formats.txt

Lines changed: 6 additions & 6 deletions

@@ -37,7 +37,7 @@ represent the :ref:`sample document <kafka-df-sample-doc>` in JSON like this:

    {"company":"MongoDB"}

-You may encounter the following data formats related to JSON when working with the {+connector+}:
+You may encounter the following data formats related to JSON when working with the connector:

 - :ref:`Raw JSON <kafka-df-raw-json>`
 - :ref:`BSON <kafka-df-bson>`
@@ -130,7 +130,7 @@ Avro
 ----

 Apache Avro is an open-source framework for serializing and transporting
-data described by schemas. Avro defines two data formats relevant to the {+connector+}:
+data described by schemas. Avro defines two data formats relevant to the connector:

 - :ref:`Avro schema <kafka-df-avro-schema>`
 - :ref:`Avro binary encoding <kafka-df-avro-encoding>`
@@ -152,7 +152,7 @@ specification of the following groups of data types:

 .. warning:: Unsupported Avro Types

-   {+connector+} does not support the following Avro types:
+   The connector does not support the following Avro types:

    - ``enum`` types. Use ``string`` instead.
    - ``fixed`` types. Use ``bytes`` instead.
@@ -162,8 +162,8 @@ specification of the following groups of data types:

 .. important:: Sink Connectors and Logical Types

-   {+connector+} sink connectors support all Avro schema primitive and complex types,
-   however {+connector+} sink connectors support only the following logical types:
+   The {+sink-connector+} supports all Avro schema primitive and complex types,
+   however sink connectors support only the following logical types:

    - ``decimal``
    - ``date``
@@ -191,7 +191,7 @@ like this:
    }

 You use Avro schema when you
-:ref:`define a schema for a {+connector+} source connector <source-specify-avro-schema>`.
+:ref:`define a schema for a {+source-connector+} <source-specify-avro-schema>`.

 For a list of all Avro schema types, see the
 `Apache Avro specification <https://avro.apache.org/docs/current/spec.html>`__.
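To illustrate the ``date`` logical type mentioned in the Sink Connectors and Logical Types admonition above, here is a minimal Avro schema fragment (record and field names are made up for illustration, not taken from this commit):

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {
      "name": "orderDate",
      "type": {"type": "int", "logicalType": "date"}
    }
  ]
}
```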

source/introduction/install.txt

Lines changed: 2 additions & 2 deletions

@@ -15,7 +15,7 @@ Install the MongoDB Kafka Connector
 Overview
 --------

-Learn how to install the {+connector+}. The {+connector+} is available for Confluent Platform and
+Learn how to install the {+connector+}. The connector is available for Confluent Platform and
 {+kafka+} deployments. To see installation instructions for your deployment type,
 navigate to one of the following sections:

@@ -84,7 +84,7 @@ Install the Connector on Apache Kafka
 Download a Connector JAR File
 -----------------------------

-You can download the {+connector+} source and JAR files from the following locations:
+You can download the connector source and JAR files from the following locations:

 .. _kafka-connector-installation-reference:

source/issues-and-help.txt

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ Bugs / Feature Requests
 -----------------------

 If you think you've found a bug or want to see a new feature in the
-Kafka Connector, please open a case in our issue management tool, JIRA:
+{+connector+}, please open a case in our issue management tool, JIRA:

 * `Create an account and login <https://jira.mongodb.org>`__.
 * Navigate to `the KAFKA project <https://jira.mongodb.org/browse/KAFKA>`__.

source/migrate-from-kafka-connect-mongodb.txt

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ to the :github:`official MongoDB Kafka connector <mongodb/mongo-kafka>`.

 The following sections list the changes you must make to your Kafka
 Connect sink connector configuration settings and custom classes to transition
-to the MongoDB Kafka connector.
+to the {+sink-connector+}.

 Update Configuration Settings
 -----------------------------

source/monitoring.txt

Lines changed: 2 additions & 2 deletions

@@ -15,8 +15,8 @@ Monitoring
 Overview
 --------

-Learn how to observe the behavior of your MongoDB source or sink
-connector through **monitoring**.
+Learn how to observe the behavior of your {+source-connector+} or
+{+sink-connector+} through **monitoring**.
 Monitoring is the process of getting information about the
 activities a running program performs for use in an application
 or an application performance management library.

source/security-and-authentication/tls-and-x509.txt

Lines changed: 2 additions & 2 deletions

@@ -51,7 +51,7 @@ Store Certificates on the Worker
 --------------------------------

 Store your certificates in a **keystore** and **truststore** to secure
-your certificate credentials for each server you run your {+connector+} worker
+your certificate credentials for each server you run your connector worker
 instance on.

 Keystore
@@ -105,7 +105,7 @@ testing purposes, see
 Add Credentials to the Connector
 --------------------------------

-The {+connector+} worker processes JVM options from your ``KAFKA_OPTS``
+The connector worker processes JVM options from your ``KAFKA_OPTS``
 environment variable. The environment variable contains the path and
 password to your keystore and truststore.
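A minimal sketch of setting ``KAFKA_OPTS`` before starting the worker, using the standard JVM TLS system properties; the paths and passwords are placeholders, not values from this commit:

```shell
# Standard JVM TLS system properties read by the Connect worker's JVM.
# Paths and passwords below are illustrative only; substitute your own.
export KAFKA_OPTS="\
-Djavax.net.ssl.keyStore=/path/to/worker.keystore.jks \
-Djavax.net.ssl.keyStorePassword=<keystore password> \
-Djavax.net.ssl.trustStore=/path/to/worker.truststore.jks \
-Djavax.net.ssl.trustStorePassword=<truststore password>"
```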

source/sink-connector.txt

Lines changed: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ Overview
 --------

 This section focuses on the **{+sink-connector+}**.
-The {+sink-connector+} is a {+kafka-connect+} connector that reads data from {+kafka+} and
+The sink connector is a {+kafka-connect+} connector that reads data from {+kafka+} and
 writes data to MongoDB.

 Configuration Properties

source/sink-connector/configuration-properties.txt

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ Sink Connector Configuration Properties
 Overview
 --------

-In this section, you can read descriptions of sink connector properties,
+In this section, you can read descriptions of the {+sink-connector+} properties,
 including essential {+kafka-connect-long+} settings and {+connector+}-specific
 settings.
source/sink-connector/configuration-properties/cdc.txt

Lines changed: 2 additions & 2 deletions

@@ -17,8 +17,8 @@ Overview

 .. _sink-configuration-change-data-capture-description-start:

-Use the following configuration settings to specify a class the sink
-connector uses to process change data capture (CDC) events.
+Use the following configuration settings to specify a class the {+sink-connector+}
+uses to process change data capture (CDC) events.

 See the guide on :doc:`Sink Connector Change Data Capture </sink-connector/fundamentals/change-data-capture>`
 for examples using the built-in ``ChangeStreamHandler`` and handlers for the

source/sink-connector/configuration-properties/connector-message.txt

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ Overview
 .. _sink-configuration-message-processing-description-start:

 Use the settings on this page to configure the message processing behavior of
-the sink connector including the following:
+the {+sink-connector+} including the following:

 - Message batch size
 - Rate limits

source/sink-connector/configuration-properties/error-handling.txt

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ Overview

 .. _sink-configuration-error-handling-description-start:

-Use the following configuration settings to specify how the sink connector
+Use the following configuration settings to specify how the {+sink-connector+}
 handles errors and to configure the dead letter queue.

 .. _sink-configuration-error-handling-description-end:
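The dead letter queue settings referenced here come from the Kafka Connect framework itself; a hedged sketch (the topic name is invented for illustration):

```properties
# Tolerate bad records instead of failing the sink task.
errors.tolerance=all
# Route failed sink records to a dead letter queue topic,
# with the failure context attached as record headers.
errors.deadletterqueue.topic.name=example.deadletterqueue
errors.deadletterqueue.context.headers.enable=true
```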

source/sink-connector/configuration-properties/id-strategy.txt

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ Overview

 .. _sink-configuration-id-strategy-description-start:

-Use the following configuration settings to specify how the sink connector
+Use the following configuration settings to specify how the {+sink-connector+}
 should determine the ``_id`` value for each document it writes to MongoDB.

 .. _sink-configuration-id-strategy-description-end:

source/sink-connector/configuration-properties/kafka-topic.txt

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ Overview
 .. _sink-configuration-topic-properties-description-start:

 Use the following configuration settings to specify which Kafka topics the
-sink connector should watch for data.
+{+sink-connector+} should watch for data.

 .. _sink-configuration-topic-properties-description-end:

source/sink-connector/configuration-properties/mongodb-connection.txt

Lines changed: 2 additions & 2 deletions

@@ -17,8 +17,8 @@ Overview

 .. _sink-configuration-mongodb-connection-description-start:

-Use the following configuration settings to specify how your sink
-connector connects and communicates with your MongoDB cluster.
+Use the following configuration settings to specify how your {+sink-connector+}
+connects and communicates with your MongoDB cluster.

 .. _sink-configuration-mongodb-connection-description-end:
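The connection settings this page describes center on ``connection.uri``; a minimal sketch with placeholder host and credentials (not values from this commit):

```properties
# Placeholder URI; substitute your own cluster address and credentials.
connection.uri=mongodb://<username>:<password>@localhost:27017
```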

source/sink-connector/configuration-properties/mongodb-namespace.txt

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ Overview
 .. _sink-configuration-namespace-mapping-description-start:

 Use the following configuration settings to specify which MongoDB database
-and collection that your sink connector writes data to. You can use the
+and collection that your {+sink-connector+} writes data to. You can use the
 default ``DefaultNamespaceMapper`` or specify a custom class.

 .. _sink-configuration-namespace-mapping-description-end:
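With the default ``DefaultNamespaceMapper``, the target namespace is typically given by two settings; a sketch with example names (the database and collection values are invented for illustration):

```properties
# Write incoming records to the "quickstart.orders" namespace.
database=quickstart
collection=orders
```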

source/sink-connector/configuration-properties/post-processors.txt

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ Overview

 .. _sink-configuration-post-processors-description-start:

-Use the following configuration settings to specify how the sink connector
+Use the following configuration settings to specify how the {+sink-connector+}
 should transform Kafka data before inserting it into MongoDB.

 .. _sink-configuration-post-processors-description-end:

source/sink-connector/configuration-properties/time-series.txt

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ Overview

 .. _sink-configuration-time-series-description-start:

-Use the following configuration settings to specify how the connector
+Use the following configuration settings to specify how the {+sink-connector+}
 should sink data to a MongoDB time series collection.

 .. _sink-configuration-time-series-description-end:

source/sink-connector/configuration-properties/topic-override.txt

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ Overview

 .. _sink-configuration-topic-override-description-start:

-Use the following sink connector configuration settings to override global or
+Use the following {+sink-connector+} configuration settings to override global or
 default property settings for specific topics.

 .. _sink-configuration-topic-override-description-end:

source/sink-connector/configuration-properties/write-strategies.txt

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ Overview

 .. _sink-configuration-write-model-strategy-description-start:

-Use the strategies in the following table to specify how the sink connector
+Use the strategies in the following table to specify how the {+sink-connector+}
 writes data into MongoDB. You can specify a write strategy with the following
 configuration:

source/sink-connector/fundamentals/change-data-capture.txt

Lines changed: 4 additions & 5 deletions

@@ -15,8 +15,8 @@ Change Data Capture Handlers
 Overview
 --------

-Learn how to **replicate** your **change data capture (CDC)** events with a {+connector+} sink
-connector. CDC is a software architecture that converts changes in a datastore
+Learn how to **replicate** your **change data capture (CDC)** events with a
+{+sink-connector+}. CDC is a software architecture that converts changes in a datastore
 into a stream of **CDC events**. A CDC event is a message containing a
 reproducible representation of a change performed on a datastore. Replicating
 data is the process of applying the changes contained in CDC events from one data
@@ -55,15 +55,14 @@ You can specify a CDC handler on your sink connector with the following configur
 change.data.capture.handler=<cdc handler class>

 To learn more, see
-:doc:`change data capture configuration options </sink-connector/configuration-properties/cdc>`
-in the {+connector+}.
+:doc:`change data capture configuration options </sink-connector/configuration-properties/cdc>`.

 .. _available-cdc-handlers:

 Available CDC Handlers
 ~~~~~~~~~~~~~~~~~~~~~~

-The {+connector+} provides CDC handlers for the following CDC event producers:
+The sink connector provides CDC handlers for the following CDC event producers:

 - MongoDB
 - `Debezium <https://debezium.io/>`__
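Tying the ``change.data.capture.handler`` line above to the built-in ``ChangeStreamHandler``, a configured property might look like the following; the fully qualified class name is my assumption about the connector's package layout, so verify it against the mongo-kafka repository:

```properties
# Use the built-in handler for MongoDB change stream events.
# Class path is assumed; confirm it in the mongo-kafka source.
change.data.capture.handler=com.mongodb.kafka.connect.sink.cdc.mongodb.ChangeStreamHandler
```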

source/sink-connector/fundamentals/error-handling-strategies.txt

Lines changed: 2 additions & 2 deletions

@@ -15,7 +15,7 @@ Error Handling
 Overview
 --------

-In this guide, you can learn how to handle errors in your sink connector.
+In this guide, you can learn how to handle errors in your {+sink-connector+}.
 The following list shows some common scenarios that cause your sink
 connector to experience an error:

@@ -173,7 +173,7 @@ For more information, see Confluent's guide on
 Handle Errors at the Connector Level
 ------------------------------------

-The {+connector+} provides options that allow you to configure error
+The sink connector provides options that allow you to configure error
 handling at the connector level. The options are as follows:

 .. list-table::
