Does arachni support crawling and scanning of single page applications

Posted by Yasser on 16 Dec, 2015 04:48 PM


I have an intranet application that is built as an SPA. When I scan with ./arachni http://app/index.html, it can't crawl to the dynamic links created by JavaScript. I tried the proxy, and it seems to be the only way to train Arachni and extract the URLs to be scanned. Although I am fine with the proxy approach, which needs manual intervention, I was wondering if there is a hidden option where Arachni is smart enough to launch an internal browser, execute the JavaScript, crawl, and do everything automatically. That would be magical.
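For context, the manual training workflow mentioned above typically goes through Arachni's proxy plugin. A sketch of that workflow follows; the plugin name and `port` option syntax are assumed from Arachni 1.x, so verify them against your version's `--help` output:

```shell
# Start Arachni with the training proxy (sketch; verify option syntax
# against your Arachni version).
./arachni http://app/index.html --plugin=proxy:port=8282

# Then point a browser at the proxy (e.g. http://localhost:8282),
# click through the application manually, and use the shutdown URL the
# plugin prints to hand the recorded pages over to the scanner.
```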

  1. Posted by Tasos Laskos (Support Staff) on 16 Dec, 2015 04:53 PM


    Arachni does do that (by default it runs 6 browser workers in the background) and is usually pretty good at it.
    It could be a configuration issue or a bug somewhere.

    Any chance I can get access to the application to see what's really going on?
    Also, which version are you using?
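    For reference, the browser analysis mentioned above can be tuned from the CLI. A sketch, assuming an Arachni 1.x-style option set; verify the exact flag names with ./arachni --help for your version:

    ```shell
    # Increase the number of background browser workers (default 6) and
    # allow a deeper DOM crawl -- flag names assumed from Arachni 1.x,
    # check `./arachni --help` for your version.
    ./arachni http://app/index.html \
        --browser-cluster-pool-size=10 \
        --scope-dom-depth-limit=10
    ```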


  2. Posted by Yasser on 16 Dec, 2015 06:45 PM

    Thanks for the reply.
    Unfortunately, it is an intranet application so I wouldn't be able to give access.

    It uses jQuery UI and RequireJS as the primary JavaScript frameworks, not the commonly used Angular or React. Not sure if there is a known bug with those frameworks.

    Is there a way I could turn on logging of Arachni's internals and investigate errors?

  3. Posted by Yasser on 16 Dec, 2015 06:46 PM

    BTW, I am using the latest released version.

  4. Posted by Tasos Laskos (Support Staff) on 16 Dec, 2015 07:07 PM

    Nope, there aren't any known bugs.
    Can you try the nightlies please and see if that makes any difference?

    If you get the same results, you can enable debugging output with --output-debug=2.
    It may also be a good idea to not load any checks and just do a crawl, in order to reduce output noise and focus on the relevant browser stuff: --checks -
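    Put together, a crawl-only debug run along those lines might look like the following; this is a sketch using only the flags already given above:

    ```shell
    # Crawl only (no checks loaded) with verbose debug output, captured
    # to a file for later inspection.
    ./arachni http://app/index.html --output-debug=2 --checks - 2>&1 | tee crawl-debug.log
    ```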


  5. Posted by Yasser on 16 Dec, 2015 08:18 PM

    I downloaded the nightly arachni-2.0dev-1.0dev-linux-x86_64.tar.gz and tried:
    ./arachni http://sitehost/context/index.html --output-debug=2 --checks - --http-cookie-string='user=123'

    Opening the report shows a list of 32 URLs in the sitemap, with no issues reported.
    No errors came out in the log.
    Not sure what's stopping the crawler from clicking on the JS-generated links.

    I know it's difficult to look into this without the actual application, but if you think I can try something else, please let me know.
    Appreciate your response.

  6. Posted by Tasos Laskos (Support Staff) on 16 Dec, 2015 08:20 PM

    Can I see the output please?

    Also, would it be possible to put a test case together?
    Take the simplest form of the problem and create a tiny SPA that reproduces it and I'll work with that.

  7. Posted by Yasser on 16 Dec, 2015 08:31 PM

    Here is the output attached.
    I cannot share existing code, but I will write an SPA from scratch using the same JS frameworks and try to replicate it. I will send the code soon.

  8. Posted by Tasos Laskos (Support Staff) on 16 Dec, 2015 08:33 PM

    You missed this bit:

     [-] Retrying for: https://sitehost/ [Couldn't connect to server]
     [-] Retrying for: https://sitehost/ [Couldn't connect to server]
     [-] Retrying for: https://sitehost/ [Couldn't connect to server]
     [-] Retrying for: https://sitehost/ [Couldn't connect to server]
     [-] [framework/parts/data#pop_page_from_url_queue:147] Giving up trying to audit: https://sitehost/
     [-] [framework/parts/data#pop_page_from_url_queue:148] Couldn't get a response after 5 tries: Couldn't connect to server.

    Looks like Arachni got the boot after a while; a firewall or IDS, perhaps?

  9. Posted by Yasser on 16 Dec, 2015 08:57 PM

    The target site is on HTTP, so I'm not sure why Arachni would try to request it over HTTPS.
    I initiated the command with HTTP only: ./arachni http://sitehost/context/index.html
    Maybe it does an HTTPS check by default?

  10. Posted by Tasos Laskos (Support Staff) on 16 Dec, 2015 09:04 PM

    My bad, something must have led Arachni to that.
    I'll be waiting for the test case then.


  11. Posted by Tasos Laskos (Support Staff) on 25 Jan, 2016 10:56 PM

    By the way, I think you may be affected by the lack of this feature:

    Luckily, I'm putting the finishing touches on it right now and you can stay updated by following the discussion on GitHub.


  12. Tasos Laskos closed this discussion on 25 Jan, 2016 10:56 PM.
