.. _source-configuration-output-format:

========================
Output Format Properties
========================

.. default-domain:: mongodb

.. contents:: On this page
   :local:
   :backlinks: none
   :depth: 2
   :class: singlecol

Overview
--------

Use the following configuration settings to specify the format of the data
the source connector publishes to Kafka topics.

.. include:: /includes/source-config-link.rst

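For example, to publish record keys as JSON and attach a schema to record
values, you might set the following options in your source connector
properties (an illustrative fragment; the rest of the connector
configuration is omitted):

.. code-block:: properties

   output.format.key=json
   output.format.value=schema
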
Settings
--------

.. list-table::
   :header-rows: 1
   :widths: 30 70

   * - Name
     - Description

   * - | **output.format.key**
     - | **Type:** string
       |
       | **Description:**
       | Specifies the data format in which the source connector outputs
         the key document.
       |
       | **Default**: ``json``
       | **Accepted Values**: ``bson``, ``json``, ``schema``

   * - | **output.format.value**
     - | **Type:** string
       |
       | **Description:**
       | Specifies the data format in which the source connector outputs
         the value document.
       |
       | **Default**: ``json``
       | **Accepted Values**: ``bson``, ``json``, ``schema``

   * - | **output.json.formatter**
     - | **Type:** string
       |
       | **Description:**
       | Class name of the JSON formatter the connector should use to
         output data.
       |
       | **Default**:

       .. code-block:: none

          com.mongodb.kafka.connect.source.json.formatter.DefaultJson

       | **Accepted Values**:
       | One of the following full class names:

       .. code-block:: none

          com.mongodb.kafka.connect.source.json.formatter.DefaultJson
          com.mongodb.kafka.connect.source.json.formatter.ExtendedJson
          com.mongodb.kafka.connect.source.json.formatter.SimplifiedJson

       | Or the full class name of your custom JSON formatter.

   * - | **output.schema.key**
     - | **Type:** string
       |
       | **Description:**
       | Specifies an AVRO schema definition for the key document of the
         `SourceRecord <https://kafka.apache.org/23/javadoc/org/apache/kafka/connect/source/SourceRecord.html>`__.

       .. seealso::

          For more information on AVRO schema, see the AVRO schema guide
          (TODO: link Fundamentals > Data Formats > AVRO schema page).

       | **Default**:

       .. code-block:: json

          {
            "type": "record",
            "name": "keySchema",
            "fields": [ { "name": "_id", "type": "string" } ]
          }

       | **Accepted Values**: A valid AVRO schema

   * - | **output.schema.value**
     - | **Type:** string
       |
       | **Description:**
       | Specifies an AVRO schema definition for the value document of the
         `SourceRecord <https://kafka.apache.org/23/javadoc/org/apache/kafka/connect/source/SourceRecord.html>`__.

       .. seealso::

          For more information on AVRO schema, see the AVRO schema guide
          (TODO: link Fundamentals > Data Formats > AVRO schema page).

       | **Default**:

       .. code-block:: json

          {
            "name": "ChangeStream",
            "type": "record",
            "fields": [
              { "name": "_id", "type": "string" },
              { "name": "operationType", "type": ["string", "null"] },
              { "name": "fullDocument", "type": ["string", "null"] },
              { "name": "ns",
                "type": [{"name": "ns", "type": "record", "fields": [
                  {"name": "db", "type": "string"},
                  {"name": "coll", "type": ["string", "null"] } ]
                }, "null" ] },
              { "name": "to",
                "type": [{"name": "to", "type": "record", "fields": [
                  {"name": "db", "type": "string"},
                  {"name": "coll", "type": ["string", "null"] } ]
                }, "null" ] },
              { "name": "documentKey", "type": ["string", "null"] },
              { "name": "updateDescription",
                "type": [{"name": "updateDescription", "type": "record", "fields": [
                  {"name": "updatedFields", "type": ["string", "null"]},
                  {"name": "removedFields",
                    "type": [{"type": "array", "items": "string"}, "null"]
                  }] }, "null"] },
              { "name": "clusterTime", "type": ["string", "null"] },
              { "name": "txnNumber", "type": ["long", "null"]},
              { "name": "lsid", "type": [{"name": "lsid", "type": "record",
                "fields": [ {"name": "id", "type": "string"},
                            {"name": "uid", "type": "string"}] }, "null"] }
            ]
          }

       | **Accepted Values**: A valid AVRO schema

   * - | **output.schema.infer.value**
     - | **Type:** boolean
       |
       | **Description:**
       | Whether the connector should infer the schema for the value
         document of the `SourceRecord <https://kafka.apache.org/23/javadoc/org/apache/kafka/connect/source/SourceRecord.html>`__.
         Since the connector processes each document in isolation, the
         connector may generate many schemas.

       .. important::

          The connector only reads this setting when you set your
          ``output.format.value`` setting to ``schema``.

       | **Default**: ``false``
       | **Accepted Values**: ``true`` or ``false``

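As an illustrative sketch, the following properties combine several of the
settings above: the connector outputs the key as JSON using the simplified
formatter, and outputs the value with a schema that it infers from the
documents it reads. This is not a complete connector configuration:

.. code-block:: properties

   output.format.key=json
   output.json.formatter=com.mongodb.kafka.connect.source.json.formatter.SimplifiedJson
   output.format.value=schema
   output.schema.infer.value=true
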