Long-running arachni task issue

Shang Wang

14 Jan, 2013 06:57 PM

Dear all:

I am running a scan using Arachni (version 0.4.0.4) against a very big website, and I've encountered something confusing; I hope somebody can help me.

The scan went on for 2 days (over the weekend), and since I'm using RPC to monitor the status output periodically, I could see that Arachni's status showed "crawling", with the number of requests staying the same while the number of responses kept increasing. This morning (the 3rd day of scanning) I checked the log and found that Arachni had started auditing, but shortly after that my program kept returning timeout errors, meaning the RPC calls never returned any results. I checked the Arachni log but found no error messages; Arachni had simply stopped logging.

These are the only clues I have found, and I'm not sure what's going on with Arachni. Initially I thought it had hit a loop, but eventually the status changed to "auditing", and some vulnerabilities did come out before the crash happened. I don't know what else I can provide to help with this issue, so any advice is welcome. Thanks.

  1. Posted by Tasos Laskos (Support Staff) on 14 Jan, 2013 10:35 PM

    As the doc says, the RPC output is not at all accurate and is only there to provide some notion of progress.

    What you want is to enable the --reroute-to-logfile option when starting a Dispatcher; that way all the output will be stored under the logs folder and will be available for inspection.
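
    For example, assuming the Dispatcher executable shipped with the 0.4.x packages (arachni_rpcd), it would look something like this:

    # Start a Dispatcher and reroute all of its output to a log file
    # under the installation's logs/ directory:
    arachni_rpcd --reroute-to-logfile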

    Moreover, v0.4.0.4 had a few serious bugs, so you'd be better off trying the latest version.

    So, I can't really say what might be wrong with your scan, but the above instructions will certainly help us debug it.

  2. Posted by Shang Wang on 16 Jan, 2013 11:23 PM

    Dear Tasos:

    Thanks for your reply! I have another question. Is it possible for Arachni to narrow a scan down to only a portion of the target web application's total scope? I might provide a proof of concept to customers instead of showing them all the vulnerabilities found, so it might not be necessary to scan the whole thing. Thanks.

  3. Posted by Tasos Laskos (Support Staff) on 16 Jan, 2013 11:31 PM

    Yeah, you can use the "include" and "exclude" options, or just provide a "link_count_limit" and only scan a certain number of pages instead of the whole thing.
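
    From the command line that would look roughly like this (the URL and patterns below are placeholders, and the flag spellings match the CLI rather than the API name above):

    # Restrict the scan to matching paths, or filter out unwanted ones:
    arachni "http://example.com/" --include=/admin/ --exclude=logout

    # Or cap the number of pages the crawler will follow:
    arachni "http://example.com/" --link-count=5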

    Is that what you were looking for?

  4. Posted by Shang Wang on 17 Jan, 2013 06:06 PM

    Dear Tasos:

    Thanks for the response! I tried the command-line version of the link count option (I assume it's equivalent to the "link_count_limit" from the API doc you provided):

    arachni "http://demo.testfire.net/default.aspx" --link-count=5

    However, Arachni ran for 1 hour with no sign of stopping, so I manually killed the scan. Here's the output that I found suspicious:

    [~] Total: 241
     [+] Without issues: 199
     [-] With issues: 42 ( 17% )
    
     [~] 1.84% [=>                                                           ] 100% 
     [~] Est. remaining time: --:--:--
    
     [~] Crawler has discovered 572 pages.
     [~] Audit limited to a max of 5 pages -- excluding 596 pages of Trainer feedback.
    
     [~] Sent 70110 requests.
     [~] Received and analyzed 69735 responses.
     [~] In 00:50:30
     [~] Average: 23 requests/second.
    

    Looking at the above message, it seems like the --link-count flag applies to the auditor instead of the crawler? Here's the quote for --link-count from the website:

    Link count limit (--link-count)
    
    Expects: integer
    Default: infinite
    Multiple invocations?: no
    
    It specifies how many links the crawler should follow.
    

    I'm afraid I missed something important. Please help, thanks.

  5. Posted by Tasos Laskos (Support Staff) on 17 Jan, 2013 06:10 PM

    Looks like the Trainer subsystem kept finding more pages and elements during the audit and fed them back to the framework, effectively overriding the link-count-limit.

    This is a bug and I'll take care of it today; thanks for the feedback.

  6. Posted by Tasos Laskos (Support Staff) on 18 Jan, 2013 12:11 AM

    Just pushed the fix for this.

  7. Posted by Shang Wang on 22 Jan, 2013 02:57 PM

    Dear Tasos:

    I fetched the experimental branch from GitHub and the fix works. Thanks a lot. However, I have other issues. After running Arachni through the command line for 5~6 hours, it seems to be so addicted to scanning that when I try to use Ctrl-C + r to generate a report, it totally ignores me. I did that several times and Arachni finally gave me a blank report. What's worse is that if you run Arachni for an even longer time, say 24 hours, the process gets "killed". I'm not sure whether it's the operating system that kills it or a software crash, but it looks more like a system kill because the last line on the screen shows "Killed". Since this evidence is beyond my knowledge, I'd appreciate your help analysing it. Sorry for having so many questions, and again thanks for making Arachni a better scanner.

  8. Posted by Tasos Laskos (Support Staff) on 22 Jan, 2013 03:14 PM

    I'm surprised that the experimental package worked for you, as it hasn't been updated since the 14th -- I'm having some trouble with my build box and will try to fix it today.

    As for the rest... I have no idea; I'll need to stress test it and see if I can reproduce the issues.

  9. Posted by Shang Wang on 22 Jan, 2013 04:26 PM

    OK, no more questions for now! Thanks for the help!

  10. Posted by Tasos Laskos (Support Staff) on 30 Jan, 2013 01:25 AM

    After investigating, it looks like the process exceeded a hard ulimit (probably RAM).

    The reason for the kill should be recorded in one of these (see the grep sketch after the list):

    • /var/log/kern.log
    • /var/log/messages
    • /var/log/syslog
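
    Something like the following should surface the relevant entry (the exact message wording varies by kernel version):

    # Search the system logs for an OOM-killer entry naming the process:
    grep -i -E "killed process|out of memory" /var/log/kern.log /var/log/messages /var/log/syslog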

    I'm guessing there must be a memory leak somewhere.

  11. Posted by Tasos Laskos (Support Staff) on 30 Jan, 2013 05:00 AM

    To keep you updated: the good news is that I know how to decrease memory usage under certain circumstances.

    For example, some recon modules that should have had a cap on the maximum number of issues they are allowed to log, and the data structure that holds signatures for custom-404 detection, were both left unchecked.
    Admittedly, that oversight was quite naive and I will take care of the issue during the weekend.

    If that's not what's causing the crash then I'll have to look deeper.

  12. Posted by Shang Wang on 30 Jan, 2013 09:45 PM

    I'm very glad for your help and looking forward to your progress. By the way, where can I get the latest working Arachni? GitHub or the Arachni website?

  13. Posted by Tasos Laskos (Support Staff) on 30 Jan, 2013 10:16 PM

    Sort of depends...

    The latest stable version you get from the downloads page on the Arachni website.

    The latest dev/unstable code you get from the GitHub repository.
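
    For the experimental branch mentioned earlier, something along these lines works (the repository path below is the main Arachni repo on GitHub):

    # Grab the dev code and switch to the experimental branch:
    git clone https://github.com/Arachni/arachni.git
    cd arachni
    git checkout experimental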

  14. Posted by Tasos Laskos (Support Staff) on 31 Jan, 2013 10:13 PM

    I've got bad news, good news and great news.

    The bad news is that the issue wasn't caused by the things I initially thought and mentioned.
    The good news is that after optimizing for those things, performance did increase, although only slightly.
    The great news is that I figured out what the problem was and just pushed the fix to the experimental branch.

    To maintain stability and ensure some level of reliability for timing attacks, these operations were saved in order to be performed last, and that meant keeping a bunch of blocks in an array.
    Unfortunately, lots of blocks meant lots of scopes to be saved, and that led to significant memory consumption.

    I adjusted the algorithm I use so that it doesn't require that much memory, and that led to increased reliability/accuracy as well.

    I am still testing it, and if need be I can decrease memory consumption even more... still looking for the sweet spot.

  15. Tasos Laskos closed this discussion on 31 Jan, 2013 10:13 PM.

