The intention of the `is-crawlable` audit is to help site owners understand whether they are correctly blocking their page from indexing.

However, the audit doesn't handle an edge case described in the Google Search docs:

> If the page is blocked by a robots.txt file or the crawler can't access the page, the crawler will never see the noindex directive, and the page can still appear in search results, for example if other pages link to it.

In this case, it would be more helpful if the Lighthouse audit flagged that the page may actually be indexable, because blocking crawlers prevents them from discovering the `noindex` directive.
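As a concrete (hypothetical) illustration of the edge case, consider a site with this combination — the page path and file names are made up for the example:

```
# robots.txt — prevents crawlers from ever fetching the page
User-agent: *
Disallow: /private-page.html
```

```html
<!-- /private-page.html — the crawler never fetches this page,
     so it never sees the noindex directive below -->
<meta name="robots" content="noindex">
```

Here the audit currently reports the page as blocked from indexing, even though Google may still index the URL (without crawling it) if other pages link to it.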