Build a valid robots.txt in seconds. Control which bots crawl your site, protect admin areas, block bad crawlers, and add your sitemap URL.
RankAl SEO generates and updates your robots.txt directly from WordPress admin — no FTP needed.
Specifies which crawler the following rules apply to. Use * for all bots, or a specific name like Googlebot.
Blocks a bot from crawling the specified path. An empty value means "allow everything". A single / blocks the entire site.
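For example, a minimal group combining the two directives above (the domain and path are placeholders, not defaults the generator produces):

```text
# Apply these rules to every crawler
User-agent: *
# Keep bots out of the WordPress admin area
Disallow: /wp-admin/
```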
Explicitly permits crawling of a path — useful when a parent is disallowed but you want a subfolder crawled.
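A common WordPress pattern illustrating this: the admin directory is disallowed, but one file inside it is explicitly re-allowed so AJAX requests keep working. The more specific rule wins:

```text
User-agent: *
# Block the admin area as a whole...
Disallow: /wp-admin/
# ...but still permit the AJAX endpoint inside it
Allow: /wp-admin/admin-ajax.php
```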
Points crawlers to your sitemap. This directive is independent of any user-agent group and is conventionally placed at the end of the file. Use the full absolute URL.
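Putting it all together, a complete file might look like this sketch (example.com stands in for your own domain):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Not tied to any user-agent group; use the full absolute URL
Sitemap: https://example.com/sitemap.xml
```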