[[dataframe-limitations]]
== {dataframe-cap} limitations

beta[]

The following limitations and known problems apply to the 7.2 release of
the Elastic {dataframe} feature:

[float]
[[df-datatype-limitations]]
=== {dataframe-cap} data type limitation

{dataframes-cap} do not yet support fields that contain arrays, in either the
UI or the API. If you try to create a {dataframe} on a source index with such
fields, the UI fails to show the source index table.

[float]
[[df-ccs-limitations]]
=== {ccs-cap} limitation

{ccs-cap} is not supported in 7.2 for {dataframe-transforms}.

[float]
[[df-kibana-limitations]]
=== {kib} only displays up to 100 {dataframe-transforms}

The {kib} *Machine Learning* > *Data Frames* page lists up to 100
{dataframe-transforms}. You can work around this limitation by calling the
{ref}/get-data-frame-transform.html[GET {dataframe-transforms} API]
with the `size` parameter.

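For example, a request like the following retrieves up to 200
{dataframe-transforms} in a single response (the `size` value of `200` is
illustrative):

[source,js]
------------------------------------------------------------
GET _data_frame/transforms?size=200
------------------------------------------------------------
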
[float]
[[df-dateformat-limitations]]
=== Date histogram limitation

If you use a {ref}/search-aggregations-bucket-datehistogram-aggregation.html[date
histogram] in the `group_by` object in the create or preview {dataframe-transform}
APIs, the defined interval and time format must have the same time fidelity.
Otherwise, distinct buckets can map to the same formatted date key, which causes
issues in the {dataframe}.

For example, if you set the `calendar_interval` of the date histogram to one minute
(`1m`), then make sure that the `format` is `yyyy-MM-dd HH:mm` instead of
`yyyy-MM-dd HH:00`.

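A minimal sketch of such a `group_by` object follows; the bucket name
`timestamp_bucket` and the field name `timestamp` are illustrative, not part of
the API:

[source,json]
------------------------------------------------------------
"group_by": {
  "timestamp_bucket": {
    "date_histogram": {
      "field": "timestamp",
      "calendar_interval": "1m",
      "format": "yyyy-MM-dd HH:mm"
    }
  }
}
------------------------------------------------------------
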
[float]
=== Date format limitation in {dataframe-transform} destination index

When you create a {dataframe-transform} that uses a `date_histogram` as a `group_by`
and set the interval to `1y`, the date can be interpreted incorrectly in the
generated date field of the destination index, because the `yyyy` value can be
misidentified as `epoch_millis`. As a workaround, you can use the API to define a
custom date format mapping for the destination index before you start the
{dataframe-transform}. For example:

[source,json]
------------------------------------------------------------
"mappings" : {
  "properties" : {
    "custom_date" : {
      "type" : "date",
      "format" : "yyyy"
    }
  }
}
------------------------------------------------------------

[float]
[[df-aggresponse-limitations]]
=== Aggregation responses may be incompatible with destination index mappings

{dataframes-cap} use composite aggregations to transform data. In some cases,
composite aggregations may return responses that are not compatible with the
mappings set for the destination index, for example `NaN`, `Infinity`, or a
numeric overflow. Where possible, a null response is substituted. Check the
{es} logs if you think this may have occurred. As a workaround, you can use the
API to define custom destination index mappings before you start the
{dataframe-transform}.