3 files changed: +23 −8 lines changed

File 1 of 3:

@@ -2881,8 +2881,8 @@ components:
     beforeIndexPublishing:
       type: object
       description: >-
-        Checks triggered after the crawl finishes and before the records are
-        added to the Algolia index.
+        These checks are triggered after the crawl finishes but before the
+        records are added to the Algolia index.
       properties:
         maxLostRecordsPercentage:
           type: number
@@ -2900,9 +2900,14 @@ components:
           minimum: 1
           maximum: 100
           default: 10
+        maxFailedUrls:
+          type: number
+          description: |
+            Stops the crawler if a specified number of pages fail to crawl.
+            If undefined, the crawler won't stop if it encounters such errors.
     safetyChecks:
       type: object
-      description: Safety checks for ensuring data integrity between crawls.
+      description: Checks to ensure the crawl was successful.
       properties:
         beforeIndexPublishing:
           $ref: '#/components/schemas/beforeIndexPublishing'
File 2 of 3:

@@ -2881,8 +2881,8 @@ components:
     beforeIndexPublishing:
      type: object
       description: >-
-        Checks triggered after the crawl finishes and before the records are
-        added to the Algolia index.
+        These checks are triggered after the crawl finishes but before the
+        records are added to the Algolia index.
       properties:
         maxLostRecordsPercentage:
           type: number
@@ -2900,9 +2900,14 @@ components:
           minimum: 1
           maximum: 100
           default: 10
+        maxFailedUrls:
+          type: number
+          description: |
+            Stops the crawler if a specified number of pages fail to crawl.
+            If undefined, the crawler won't stop if it encounters such errors.
     safetyChecks:
       type: object
-      description: Safety checks for ensuring data integrity between crawls.
+      description: Checks to ensure the crawl was successful.
       properties:
         beforeIndexPublishing:
           $ref: '#/components/schemas/beforeIndexPublishing'
File 3 of 3:

@@ -444,14 +444,14 @@ extraParameters:

 safetyChecks:
   type: object
-  description: Safety checks for ensuring data integrity between crawls.
+  description: Checks to ensure the crawl was successful.
   properties:
     beforeIndexPublishing:
       $ref: '#/beforeIndexPublishing'

 beforeIndexPublishing:
   type: object
-  description: Checks triggered after the crawl finishes and before the records are added to the Algolia index.
+  description: These checks are triggered after the crawl finishes but before the records are added to the Algolia index.
   properties:
     maxLostRecordsPercentage:
       type: number
@@ -464,6 +464,11 @@ beforeIndexPublishing:
       minimum: 1
       maximum: 100
       default: 10
+    maxFailedUrls:
+      type: number
+      description: |
+        Stops the crawler if a specified number of pages fail to crawl.
+        If undefined, the crawler won't stop if it encounters such errors.

 schedule:
   type: string
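
For illustration, a minimal sketch of how these safety checks might be set in a crawler configuration, assuming the crawler accepts a top-level safetyChecks object matching the schema above; the specific values are hypothetical, not recommendations.

# Hypothetical crawler configuration excerpt (values are illustrative).
safetyChecks:
  beforeIndexPublishing:
    # Presumably the maximum acceptable percentage of lost records between
    # crawls (allowed range 1-100, default 10 per the schema above).
    maxLostRecordsPercentage: 10
    # New option added by this change: stop the crawler once 15 pages have
    # failed to crawl; leaving it unset means such errors never stop the crawl.
    maxFailedUrls: 15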