fix(specs): additional safetyChecks (#4128) (generated) [skip ci]
Co-authored-by: gazconroy <gazconroyster@gmail.com>
Co-authored-by: Gary Conroy <gary.conroy@LON-M3P-GConroy.fritz.box>
Co-authored-by: Clément Vannicatte <vannicattec@gmail.com>
4 people committed Nov 18, 2024
1 parent 212cae6 commit aae4ddb
Showing 2 changed files with 16 additions and 6 deletions.
11 changes: 8 additions & 3 deletions specs/bundled/crawler.doc.yml
@@ -2881,8 +2881,8 @@ components:
   beforeIndexPublishing:
     type: object
     description: >-
-      Checks triggered after the crawl finishes and before the records are
-      added to the Algolia index.
+      These checks are triggered after the crawl finishes but before the
+      records are added to the Algolia index.
     properties:
       maxLostRecordsPercentage:
         type: number
@@ -2900,9 +2900,14 @@ components:
         minimum: 1
         maximum: 100
         default: 10
+      maxFailedUrls:
+        type: number
+        description: |
+          Stops the crawler if a specified number of pages fail to crawl.
+          If undefined, the crawler won't stop if it encounters such errors.
   safetyChecks:
     type: object
-    description: Safety checks for ensuring data integrity between crawls.
+    description: Checks to ensure the crawl was successful.
     properties:
       beforeIndexPublishing:
         $ref: '#/components/schemas/beforeIndexPublishing'
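For context, the schema change above describes a crawler configuration fragment like the following. This is an illustrative sketch, not code from the commit: the property names (`safetyChecks`, `beforeIndexPublishing`, `maxLostRecordsPercentage`, `maxFailedUrls`) come from the spec diff, while the specific values shown are assumptions.

```yaml
# Hypothetical crawler configuration fragment using the new check.
# safetyChecks.beforeIndexPublishing runs after the crawl finishes
# but before records are added to the Algolia index.
safetyChecks:
  beforeIndexPublishing:
    # Fail if more than 10% of records would be lost compared
    # with the previous crawl (spec default is 10).
    maxLostRecordsPercentage: 10
    # New in this commit: stop the crawl once 50 pages fail to
    # crawl; leaving it undefined disables this check.
    maxFailedUrls: 50
```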
11 changes: 8 additions & 3 deletions specs/bundled/crawler.yml
@@ -2881,8 +2881,8 @@ components:
   beforeIndexPublishing:
     type: object
     description: >-
-      Checks triggered after the crawl finishes and before the records are
-      added to the Algolia index.
+      These checks are triggered after the crawl finishes but before the
+      records are added to the Algolia index.
     properties:
       maxLostRecordsPercentage:
         type: number
@@ -2900,9 +2900,14 @@ components:
         minimum: 1
         maximum: 100
         default: 10
+      maxFailedUrls:
+        type: number
+        description: |
+          Stops the crawler if a specified number of pages fail to crawl.
+          If undefined, the crawler won't stop if it encounters such errors.
   safetyChecks:
     type: object
-    description: Safety checks for ensuring data integrity between crawls.
+    description: Checks to ensure the crawl was successful.
     properties:
       beforeIndexPublishing:
         $ref: '#/components/schemas/beforeIndexPublishing'