The robotsvalidator script lets you check whether URLs are allowed or disallowed by a robots.txt file.
- Getting the robots.txt file from a local file
- Getting the robots.txt file from a URL
- Verbose mode, showing every rule with its result
Verbose mode is enabled with the `--debug` option, which prints every rule along with its result.
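The kind of check this tool performs can be sketched with Python's standard `urllib.robotparser` module. This is not this project's implementation, just an illustration of how robots.txt rules are matched against URLs; the rules and URLs below are hypothetical examples.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; the tool can load this
# from a local file or fetch it from a URL instead.
rules = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether each URL is allowed for a generic user agent.
for url in ["https://example.com/public/page.html",
            "https://example.com/private/data.html"]:
    verdict = "allowed" if parser.can_fetch("*", url) else "disallowed"
    print(url, "->", verdict)
```

`RobotFileParser.set_url` plus `read()` could fetch a live robots.txt over HTTP instead of parsing an in-memory string.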
Pull requests are welcome. Feel free to open an issue if you want to add other features.