
Consider failing SEO category when page is blocked from indexing #14832

Closed
patrickhulce opened this issue Feb 27, 2023 · 2 comments · Fixed by #15933

@patrickhulce
Collaborator

Summary
The SEO category bears many similarities to the PWA category in that a numeric weighted-average score does not accurately reflect the result. For example, if you've done all the optimization in the world but your robots.txt accidentally blocks the page from being discovered, you really should have a score of 0.

Ideas:

  • Just increase the weight of is-crawlable to something very high (though this dilutes other results).
  • Bespoke logic to give the category a 0 if is-crawlable fails.
  • Something fancier, à la the old PWA category passing criteria.

Context: https://twitter.com/filrakowski/status/1630246287202484224
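The second idea above could look something like the following sketch, where a post-processing gate zeroes out the category score. The `CategoryResult` type, `auditScores` field, and `applyCrawlableGate` function are all hypothetical illustrations, not actual Lighthouse internals:

```typescript
// Hypothetical sketch: bespoke logic that zeroes the SEO category when
// is-crawlable fails. Types and names are illustrative only.
interface CategoryResult {
  score: number; // weighted-average category score, 0..1
  auditScores: Record<string, number>; // audit id -> 0 (fail) or 1 (pass)
}

function applyCrawlableGate(category: CategoryResult): CategoryResult {
  // A page blocked from indexing gets 0 regardless of its other audits.
  if (category.auditScores['is-crawlable'] === 0) {
    return { ...category, score: 0 };
  }
  return category;
}
```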

@connorjclark
Collaborator

connorjclark commented Mar 4, 2023

Perhaps a `necessary: boolean` field on `AuditRef`.

@paulirish
Member

After discussion: we'll increase the weighting of is-crawlable to ensure it's worth at least 31% of the SEO score.
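To see why that weighting works, here is a minimal sketch of weighted-average category scoring. The audit ids and weight values are illustrative, not Lighthouse's actual SEO config; the point is that once is-crawlable carries 31% of the total weight, failing it alone caps an otherwise-perfect category at 0.69:

```typescript
// Hypothetical weighted-average scoring, in the style of Lighthouse
// category scoring. Audit ids and weights below are made up for illustration.
interface AuditRef {
  id: string;
  weight: number;
  score: number; // 0 (fail) or 1 (pass) for binary audits
}

function categoryScore(refs: AuditRef[]): number {
  const totalWeight = refs.reduce((sum, r) => sum + r.weight, 0);
  if (totalWeight === 0) return 0;
  return refs.reduce((sum, r) => sum + r.score * r.weight, 0) / totalWeight;
}

// is-crawlable holds 31 of 100 total weight; everything else passes.
const refs: AuditRef[] = [
  { id: 'is-crawlable', weight: 31, score: 0 },
  { id: 'document-title', weight: 23, score: 1 },
  { id: 'meta-description', weight: 23, score: 1 },
  { id: 'link-text', weight: 23, score: 1 },
];
// categoryScore(refs) → 0.69: the single failure costs 31% of the score.
```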
