robots.txt Testing Tool

Checks a list of URLs against a robots.txt file to see whether each one is allowed or blocked, and if blocked, by which rule.

Uses the Google Robots.txt Parser and Matcher Library, which matches the one used in production at Google.
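The core matching step can be sketched in a few lines. This is an illustrative approximation, not the tool's actual code: the real tool uses Google's C++ parser, while the sketch below implements the same general rules from the Robots Exclusion Protocol (RFC 9309) — the longest matching path wins, Allow beats Disallow on a tie, and `*` and `$` act as wildcard and end anchor. The function names and rule format are hypothetical.

```python
import re

def matches(pattern, path):
    # Support '*' wildcards and a trailing '$' end anchor,
    # as Google's parser does.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

def check(rules, path):
    """rules: list of (kind, pattern) pairs, e.g. ("disallow", "/private").
    Returns (allowed, matching_rule_or_None)."""
    best = None  # ((pattern_length, is_allow), kind, pattern)
    for kind, pattern in rules:
        if not pattern:
            continue  # an empty Disallow matches nothing
        if matches(pattern, path):
            key = (len(pattern), kind == "allow")  # longest wins; Allow wins ties
            if best is None or key > best[0]:
                best = (key, kind, pattern)
    if best is None:
        return True, None  # no rule matched: allowed by default
    _, kind, pattern = best
    return kind == "allow", f"{kind.capitalize()}: {pattern}"

rules = [("disallow", "/private"), ("allow", "/private/ok")]
print(check(rules, "/private/ok/page"))  # allowed: the longer Allow rule wins
print(check(rules, "/private/secret"))   # blocked by Disallow: /private
```

This also shows why the tool can report *which* rule blocked a URL: the winning rule is a by-product of the longest-match comparison.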

Parsing and matching is only part of the picture: a search engine or other service may fall back to, or ignore, certain rules. For example, Googlebot-Image falls back to the Googlebot rules if no User-agent group specifically matches it. This tool attempts to mimic that behaviour for Google and Applebot.
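The fallback behaviour described above can be sketched as a chain of user-agent tokens tried in order. This is a hedged illustration, not the tool's implementation: the fallback chains and the `select_group` helper are assumptions for the sketch, with `*` as the final catch-all group.

```python
# Hypothetical fallback chains: each crawler tries its own group first,
# then any parent crawler's group, then the "*" group.
FALLBACKS = {
    "googlebot-image": ["googlebot-image", "googlebot", "*"],
    "googlebot": ["googlebot", "*"],
    "applebot": ["applebot", "*"],
}

def select_group(groups, user_agent):
    """groups: {token: rules}. Returns (token_used, rules), or (None, [])
    when no group matches (in which case everything is allowed)."""
    chain = FALLBACKS.get(user_agent.lower(), [user_agent.lower(), "*"])
    for token in chain:
        if token in groups:
            return token, groups[token]
    return None, []

groups = {
    "googlebot": [("disallow", "/nogoogle")],
    "*": [("disallow", "/private")],
}
# No Googlebot-Image group exists, so it falls back to the Googlebot group.
print(select_group(groups, "Googlebot-Image"))
```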



Enter URLs one per line; each should start with http:// or https://.


Idea for this tool by Jamie Indigo (@Jammer_Volts) of Not a Robot.

Find This Useful?

You can help support the cost of these free tools by sponsoring me on GitHub,

or buy me a coffee on Ko-fi.
