How to specify the urls that I want to scan?


Shang Wang

07 May, 2013 05:44 PM

Dear Tasos:

I'm currently using Arachni to examine a web application that has only around 15 pages. However, a lot of them can only be visited by triggering JavaScript events. As far as I know, Arachni is not yet able to process JavaScript, so I'm wondering if I can specify those links manually in one scan (otherwise I'd have to run 15 separate scans, which doesn't help very much). Can you point me to both the command-line and API documentation that explain this? Or some small examples should suffice. Thanks.

  1. Posted by Tasos Laskos (Support Staff) on 07 May, 2013 05:50 PM


    Sure thing:

    Feel free to re-open the discussion if you need further clarifications.

  2. Tasos Laskos closed this discussion on 07 May, 2013 05:50 PM.

  3. Shang Wang re-opened this discussion on 07 May, 2013 08:09 PM

  4. Posted by Shang Wang on 07 May, 2013 08:09 PM


    Dear Tasos:

    Thanks for the response. I created a file, put an extra URL in it, and passed the file name via the --extend_paths parameter. Is this the right method? I couldn't find a test site with the same situation as my client's (links triggered via JavaScript), and since today is not a good day for testing, I'm not sure how to verify it.

    Another concern is that I cannot hard-code any links in our product (which uses Arachni as its engine); I need to write something that dynamically detects all JavaScript links. My question is: if a link is out of scope (not on the same domain as the original target), will the --extend_paths parameter add that domain as a target, or will it simply be excluded? Thanks.

  5. Posted by Tasos Laskos (Support Staff) on 07 May, 2013 08:17 PM


    No, the CLI argument is --extend-paths; the API option is extend_paths.

    You could use an external crawler that can interpret JS and then pass that sitemap to Arachni. Or, better yet, fire up the proxy plugin and configure the external crawler to use Arachni's proxy; that way AJAX requests will be seen and analyzed by Arachni as well.

    And no, if a path matches an exclusion criterion it will be ignored.
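    Putting the two answers together, a minimal sketch of the file-based approach might look like the following. The URLs, file name, and target are placeholders, and the exact flag spellings should be verified against your Arachni version's `--help` output:

    ```shell
    # Hypothetical paths file: one extra URL (or path) per line.
    cat > extra_paths.txt <<'EOF'
    http://example.com/page-only-reachable-via-js
    http://example.com/another-js-only-page
    EOF

    # Seed the crawl with those paths (note the dash in the CLI spelling):
    #   arachni http://example.com --extend-paths=extra_paths.txt
    #
    # Alternatively, run the proxy plugin and point a JS-capable crawler
    # (or a human clicking through the app) at Arachni's proxy, so that
    # AJAX requests are captured as well:
    #   arachni http://example.com --plugin=proxy
    ```

    Per the exclusion note above, any entry in the file that matches the scan's exclusion or scope criteria would still be ignored rather than widening the scope.
    
    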


  6. Tasos Laskos closed this discussion on 07 May, 2013 08:17 PM.
