Debug Arachni crawler

Ralf

08 Nov, 2019 06:06 PM

I am trying to audit one of our Angular cloud apps. Unfortunately Arachni has some difficulties with it (or, more likely, I am not using it right). I created a login script, and it reports a successful login, showing a token as expected, and the session check seems to work fine. Still, it does not crawl the app. The only 4 pages it finds are the page loaded initially and some static resources. Strangely, at some point during my tests it found a few more pages (still not everything), but I don't think I have changed anything since then. I tried to play around with the --browser-cluster-wait-for-element option to wait for specific elements (is this the correct way to apply it to all pages: --browser-cluster-wait-for-element=".*:.someClass"?), but it still doesn't find any more links.
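
For context, the invocation looks roughly like this (hostname, URLs, CSS selector and the script path are placeholders rather than the real values):

  arachni "https://sitename/" \
    --plugin=login_script:script=login.rb \
    --session-check-url="https://sitename/api/session" \
    --session-check-pattern="token" \
    --browser-cluster-wait-for-element=".*:.someClass" \
    --output-debug=3
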
Even stranger: if I point Arachni at the main page without a path, "https://sitename/", it should normally be redirected to "https://sitename/apps", but Arachni only indexes the page before the redirect. It looks as if the Angular JavaScript that performs the client-side routing is not executed at all.
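
(For completeness: a possible workaround would be to point the scan directly at the post-redirect URL and seed the known Angular routes by hand via a plain-text file with one path per line, assuming I am reading the --scope-extend-paths option correctly:

  arachni "https://sitename/apps" \
    --scope-extend-paths=angular-routes.txt \
    --browser-cluster-wait-for-element=".*:.someClass"

but I would much rather have the crawler discover the routes itself.)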

Is there any way to debug this? For example, by spawning a visible browser window, or at least getting some kind of browser console output and/or the server responses?
With --output-debug=3 I don't get any relevant information.
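
(For reference, the most console output I know how to request is something along these lines, and as far as I can tell neither option surfaces the browser console or the raw responses:

  arachni "https://sitename/" \
    --output-verbose \
    --output-debug=3
)
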
I would be grateful for any help.
Thanks in advance!
