robots.txt Testing & Validation Tool


Checks a list of URLs against the live robots.txt file, or a custom one, to see whether each URL is allowed or blocked, and if blocked, by which rule. Enter one URL per line; each should start with http or https.

If you want to test changes before you publish your robots.txt file, untick the "Use Live robots.txt?" tick box and make your changes in the custom robots.txt instead.

The tool uses the Google Robots.txt Parser and Matcher Library, the same parser Google uses in production.
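For illustration, here is a minimal sketch of the same kind of check using Python's standard urllib.robotparser. This is not the Google library the tool uses, and the two parsers can disagree on edge cases, so treat it only as a rough picture of the allow/block test.

```python
import urllib.robotparser

# Example rules; the tool would use your live or custom robots.txt instead.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check each URL in the list against the parsed rules.
urls = [
    "https://example.com/",
    "https://example.com/private/page.html",
]
for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```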

Parsing and matching is only part of the picture: a search engine or other service may fall back to, or ignore, certain rules. For example, Googlebot-Image falls back to the Googlebot rules if no Googlebot-Image User-agent group is found. This tool attempts to mimic that behaviour for Google and Applebot.
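A hypothetical sketch of that fallback logic is below. The Googlebot-Image to Googlebot chain comes from the description above; treating the generic * group as the final fallback is standard robots.txt behaviour. Group parsing itself is elided, and the exact chains the tool uses for other crawlers are not stated here.

```python
# Assumed fallback chains; only the Googlebot-Image entry is from the text above.
FALLBACK_CHAINS = {
    "googlebot-image": ["googlebot-image", "googlebot"],
}

def effective_rules(groups, user_agent):
    """Pick the rule group a crawler would actually obey.

    groups: dict mapping a lowercased User-agent token to its list of rules.
    """
    ua = user_agent.lower()
    for token in FALLBACK_CHAINS.get(ua, [ua]) + ["*"]:
        if token in groups:
            return groups[token]
    return []  # no matching group: everything is allowed

# No Googlebot-Image group here, so its rules fall back to Googlebot's.
groups = {
    "googlebot": ["Disallow: /private/"],
    "*": ["Disallow: /"],
}
print(effective_rules(groups, "Googlebot-Image"))  # ['Disallow: /private/']
```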

New! Get the Chrome Extension

The Chrome Extension lets you access this tool with a simple right-click. Grab it from the Chrome Web Store.



A custom robots.txt will apply to all URLs, regardless of origin. For example:

URL                                                       Live robots.txt                               Custom robots.txt
https://example.com/test                                  https://example.com/robots.txt                the custom robots.txt
https://www.differentexample.com/some/test/product.html   https://www.differentexample.com/robots.txt   the custom robots.txt
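In other words, the live robots.txt location follows from each URL's own origin (scheme plus host), while a custom robots.txt is applied to every URL unchanged. A small sketch of that origin step, assuming Python rather than the tool's actual implementation:

```python
from urllib.parse import urlsplit

def live_robots_url(page_url):
    # robots.txt always lives at the root of the URL's own origin.
    parts = urlsplit(page_url)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

print(live_robots_url("https://example.com/test"))
# https://example.com/robots.txt
print(live_robots_url("https://www.differentexample.com/some/test/product.html"))
# https://www.differentexample.com/robots.txt
```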

Brought to you in conjunction with Jamie Indigo (@Jammer_Volts) of Not a Robot.

Find This Useful?

You can help support the cost of these free tools by sponsoring me on GitHub:

Or buy me a coffee on Ko-fi.
