
Commit

Merge branch 'main' into feat/composition-default-authmode
shortcuts authored Nov 18, 2024
2 parents 98eb49e + aae4ddb commit 0e97d38
Showing 3 changed files with 23 additions and 8 deletions.
11 changes: 8 additions & 3 deletions specs/bundled/crawler.doc.yml
@@ -2881,8 +2881,8 @@ components:
     beforeIndexPublishing:
       type: object
       description: >-
-        Checks triggered after the crawl finishes and before the records are
-        added to the Algolia index.
+        These checks are triggered after the crawl finishes but before the
+        records are added to the Algolia index.
       properties:
         maxLostRecordsPercentage:
           type: number
@@ -2900,9 +2900,14 @@ components:
           minimum: 1
           maximum: 100
           default: 10
+        maxFailedUrls:
+          type: number
+          description: |
+            Stops the crawler if a specified number of pages fail to crawl.
+            If undefined, the crawler won't stop if it encounters such errors.
     safetyChecks:
       type: object
-      description: Safety checks for ensuring data integrity between crawls.
+      description: Checks to ensure the crawl was successful.
       properties:
         beforeIndexPublishing:
           $ref: '#/components/schemas/beforeIndexPublishing'
11 changes: 8 additions & 3 deletions specs/bundled/crawler.yml
@@ -2881,8 +2881,8 @@ components:
     beforeIndexPublishing:
       type: object
       description: >-
-        Checks triggered after the crawl finishes and before the records are
-        added to the Algolia index.
+        These checks are triggered after the crawl finishes but before the
+        records are added to the Algolia index.
       properties:
         maxLostRecordsPercentage:
           type: number
@@ -2900,9 +2900,14 @@ components:
           minimum: 1
           maximum: 100
           default: 10
+        maxFailedUrls:
+          type: number
+          description: |
+            Stops the crawler if a specified number of pages fail to crawl.
+            If undefined, the crawler won't stop if it encounters such errors.
     safetyChecks:
       type: object
-      description: Safety checks for ensuring data integrity between crawls.
+      description: Checks to ensure the crawl was successful.
       properties:
         beforeIndexPublishing:
           $ref: '#/components/schemas/beforeIndexPublishing'
9 changes: 7 additions & 2 deletions specs/crawler/common/schemas/configuration.yml
@@ -444,14 +444,14 @@ extraParameters:
 safetyChecks:
   type: object
-  description: Safety checks for ensuring data integrity between crawls.
+  description: Checks to ensure the crawl was successful.
   properties:
     beforeIndexPublishing:
       $ref: '#/beforeIndexPublishing'
 
 beforeIndexPublishing:
   type: object
-  description: Checks triggered after the crawl finishes and before the records are added to the Algolia index.
+  description: These checks are triggered after the crawl finishes but before the records are added to the Algolia index.
   properties:
     maxLostRecordsPercentage:
       type: number
@@ -464,6 +464,11 @@ beforeIndexPublishing:
       minimum: 1
       maximum: 100
       default: 10
+    maxFailedUrls:
+      type: number
+      description: |
+        Stops the crawler if a specified number of pages fail to crawl.
+        If undefined, the crawler won't stop if it encounters such errors.
 
 schedule:
   type: string
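
Taken together, the commit adds a maxFailedUrls check alongside the existing maxLostRecordsPercentage in the safetyChecks schema. As a minimal sketch of how the resulting block might look in a crawler configuration (assuming the YAML layout these schemas describe; the value 30 is illustrative, not a default from this commit):

safetyChecks:
  beforeIndexPublishing:
    # Abort publishing if more than 10% of records would be lost compared
    # with the previous crawl (10 is the schema's documented default).
    maxLostRecordsPercentage: 10
    # Stop the crawler once 30 pages fail to crawl; omitting this key keeps
    # the previous behavior, where failed URLs never stop the crawl.
    maxFailedUrls: 30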
