fix(specs): additional safetyChecks (#4128)
Co-authored-by: Gary Conroy <gary.conroy@LON-M3P-GConroy.fritz.box>
Co-authored-by: Clément Vannicatte <vannicattec@gmail.com>
3 people authored Nov 18, 2024
1 parent 1f04992 commit 212cae6
Showing 1 changed file with 7 additions and 2 deletions.
specs/crawler/common/schemas/configuration.yml (+7 -2)
@@ -444,14 +444,14 @@ extraParameters:
 safetyChecks:
   type: object
-  description: Safety checks for ensuring data integrity between crawls.
+  description: Checks to ensure the crawl was successful.
   properties:
     beforeIndexPublishing:
       $ref: '#/beforeIndexPublishing'
 
 beforeIndexPublishing:
   type: object
-  description: Checks triggered after the crawl finishes and before the records are added to the Algolia index.
+  description: These checks are triggered after the crawl finishes but before the records are added to the Algolia index.
   properties:
     maxLostRecordsPercentage:
       type: number
@@ -464,6 +464,11 @@ beforeIndexPublishing:
       minimum: 1
       maximum: 100
       default: 10
+    maxFailedUrls:
+      type: number
+      description: |
+        Stops the crawler if a specified number of pages fail to crawl.
+        If undefined, the crawler won't stop if it encounters such errors.
 
 schedule:
   type: string
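For reference, a minimal sketch of how the updated safetyChecks object might appear in a crawler configuration. The surrounding keys (indexPrefix, startUrls) and the value 15 are assumed for illustration only; the safetyChecks structure itself follows the schema changed in this commit.

# Illustrative crawler configuration excerpt; only safetyChecks reflects this commit's schema.
indexPrefix: crawler_            # assumed placeholder
startUrls:                       # assumed placeholder
  - https://www.example.com
safetyChecks:
  beforeIndexPublishing:
    # Fail the check if too large a share of existing records would be lost
    # (10 is the schema default, allowed range 1-100).
    maxLostRecordsPercentage: 10
    # New in this commit: stop the crawler once this many URLs fail to crawl.
    # If omitted, failed URLs don't stop the crawl. The value 15 is an example.
    maxFailedUrls: 15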
