is-crawlable improvements #14210

Open
rviscomi opened this issue Jul 13, 2022 · 0 comments

The intention of the is-crawlable audit is to help site owners understand whether they are correctly blocking their page from indexing.

However, the audit doesn't handle an edge case described in the Google Search docs:

> If the page is blocked by a robots.txt file or the crawler can't access the page, the crawler will never see the noindex directive, and the page can still appear in search results, for example if other pages link to it.

In this case, it'd be more helpful if the Lighthouse audit flagged that the page may actually be indexable, because blocking crawlers via robots.txt prevents them from ever discovering the noindex directive.
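
For illustration, here is a minimal, hypothetical sketch of how an audit could cross-check the two signals and surface this edge case. This is not Lighthouse's actual audit code; the `PageSignals` shape and `checkIndexability` helper are assumptions made for the example.

```ts
// Hypothetical sketch: flag pages where a noindex directive may never be seen
// because robots.txt blocks the crawler from fetching the page at all.

interface PageSignals {
  url: string;
  blockedByRobotsTxt: boolean;   // e.g. the URL matches a Disallow rule
  hasNoindexDirective: boolean;  // meta robots tag or X-Robots-Tag header
}

function checkIndexability(page: PageSignals): string {
  if (page.hasNoindexDirective && page.blockedByRobotsTxt) {
    // The edge case from the Google Search docs: the crawler never fetches
    // the page, so it never sees noindex, and the URL can still appear in
    // search results if other pages link to it.
    return 'warning: noindex may be ignored because robots.txt blocks crawling';
  }
  if (page.hasNoindexDirective) {
    return 'page is blocked from indexing';
  }
  if (page.blockedByRobotsTxt) {
    return 'page is blocked from crawling (but may still be indexed via links)';
  }
  return 'page is crawlable and indexable';
}
```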
