Crawl URLs contained in robots.txt
I've seen that Arachni has very good coverage and that it already checks for a robots.txt file. It would be interesting if it also crawled the URLs listed in robots.txt. A rough sketch of the idea follows.
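To illustrate the request, here is a minimal sketch (not Arachni's actual API) of turning a site's robots.txt Allow/Disallow rules into extra seed URLs for a crawler. The base URL and function name are placeholders for illustration only.

```python
# Sketch: fetch robots.txt and turn its Allow/Disallow paths into candidate
# seed URLs for a crawler. Illustrative only; does not use Arachni's API.
from urllib.parse import urljoin
from urllib.request import urlopen


def seeds_from_robots(base_url):
    """Return absolute URLs derived from robots.txt path rules."""
    robots_url = urljoin(base_url, "/robots.txt")
    with urlopen(robots_url) as response:
        body = response.read().decode("utf-8", errors="replace")

    seeds = set()
    for line in body.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        if field.strip().lower() in ("allow", "disallow"):
            path = value.strip()
            # Skip empty rules and wildcard patterns that are not concrete paths.
            if path and "*" not in path and "$" not in path:
                seeds.add(urljoin(base_url, path))
    return sorted(seeds)


if __name__ == "__main__":
    # Hypothetical target; in practice the scan's base URL would be used.
    for url in seeds_from_robots("https://example.com/"):
        print(url)
```

The resulting URLs could simply be merged into the crawler's existing queue, so no other part of the scan would need to change.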