@@ -82,9 +82,9 @@ def transform(input_feature_group, input_csv):
         inputs (Sequence[Union[FeatureGroupDataSource, CSVDataSource, ParquetDataSource,
             BaseDataSource]]): A list of data sources.
         output (str): A Feature Group ARN to write results of this function to.
-        target_stores (Optional[list[str]], optional): A list containing at least one of
-            'OnlineStore' or 'OfflineStore'. If unspecified, data will be ingested to the enabled
-            stores of the output feature group. Defaults to None.
+        target_stores (Optional[list[str]], optional): A list containing at least one
+            of 'OnlineStore' or 'OfflineStore'. If unspecified, data will be ingested to the
+            enabled stores of the output feature group. Defaults to None.
         parameters (Optional[Dict[str, Union[str, Dict]]], optional): Parameters to be provided to
             the decorated function, available as the 'params' argument. Useful for parameterized
             functions. The params argument also contains the set of system provided parameters
@@ -96,6 +96,7 @@ def transform(input_feature_group, input_csv):
             development phase to ensure that data is not used until the function is ready. It also
             useful for users that want to manage their own data ingestion. Defaults to True.
         spark_config (Dict[str, str]): A dict contains the key-value paris for Spark configurations.
+
     Raises:
         IngestionError: If any rows are not ingested successfully then a sample of the records,
            with failure reasons, is logged.
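
For context, a minimal usage sketch of the decorated-function pattern this docstring describes, assuming it belongs to SageMaker Feature Store's feature_processor decorator. The feature group names, S3 path, account ID, ARN, and join column below are placeholders for illustration, not values taken from this change.

from sagemaker.feature_store.feature_processor import (
    CSVDataSource,
    FeatureGroupDataSource,
    feature_processor,
)

# Placeholder identifiers; substitute real feature group names, ARNs, and S3 paths.
INPUT_FEATURE_GROUP = "car-data"
INPUT_CSV = "s3a://example-bucket/demo-data/raw/csv"
OUTPUT_FEATURE_GROUP_ARN = (
    "arn:aws:sagemaker:us-west-2:111122223333:feature-group/car-data-aggregated"
)


@feature_processor(
    inputs=[FeatureGroupDataSource(INPUT_FEATURE_GROUP), CSVDataSource(INPUT_CSV)],
    output=OUTPUT_FEATURE_GROUP_ARN,
    target_stores=["OfflineStore"],       # ingest only to the offline store
    parameters={"model_year": "2020"},    # exposed to the function as the 'params' argument
)
def transform(input_feature_group, input_csv):
    # Each argument corresponds positionally to an entry in 'inputs' and arrives as a
    # Spark DataFrame. 'vin' is a placeholder join column for this sketch.
    return input_feature_group.join(input_csv, on="vin", how="inner")

Per the docstring, calling transform() runs the function and, with ingestion left enabled (the default), writes the returned DataFrame to the output feature group's enabled stores, or only to the stores listed in target_stores when that argument is provided.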