Hi,
The docs mention that "You must add a robots.txt file to allow search engines to crawl all your application pages."
Why is that?
A robots.txt file that allows everything seems unnecessary:
> **Do I have to include an allow rule to allow crawling?**
> No, you do not need to include an allow rule. All URLs are implicitly allowed and the allow rule is used to override disallow rules in the same robots.txt file.
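For reference, a fully permissive robots.txt looks like this (standard robots.txt syntax; the empty `Disallow:` line means nothing is blocked), which behaves the same as serving no robots.txt at all:

```
User-agent: *
Disallow:
```

An `Allow` rule only matters when it carves an exception out of a `Disallow` rule in the same file, e.g. `Allow: /public/` alongside `Disallow: /`.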
Thank you!