Memory management with large forms

Posted by Mike on 07 Nov, 2013 06:07 PM

I am auditing a site that has a form with somewhere around 2,000 inputs (not my design), and Arachni is bailing out due to a lack of available memory. The VM that Arachni is running in is not out of memory when this error occurs, however.

I can provide more details if you'd like, and I have attached the stack trace thrown by the RPC daemon (rpcd).


  1. Posted by Tasos Laskos (Support Staff) on 07 Nov, 2013 09:29 PM

    Hey Mike,

    The stack trace shows that memory ran out while the issue data were being copied prior to a processing operation. Either there were too many issues for the machine to hold in memory, or no memory was left because it had all been used to generate form mutations (the latter sounds more likely).

    How much RAM does the machine have?

  2. Posted by Mike on 08 Nov, 2013 07:23 PM

    It is a VMware VM with 4GB available... However, RAM utilization never exceeds 3GB during the audit.

  3. Posted by Tasos Laskos (Support Staff) on 08 Nov, 2013 07:27 PM

    3GB of RAM usage is unacceptable regardless of the reason.

    I skipped your case while performing memory optimizations recently because I figured that no-one would run into something like that, but clearly you're affected.

    I'm half-done with the fix; I'll let you know once it's ready.


  4. Posted by Tasos Laskos (Support Staff) on 08 Nov, 2013 11:07 PM

    I'm now running the full test suite to make sure the fix didn't introduce any regressions, so I've got some time to explain the issue in some detail.

    What was going on was that the system needed to generate fuzzing mutations for that form, and since it had 2,000 inputs there was an astounding number of mutations to generate and hold in memory.
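    To get a rough sense of the scale, here is a back-of-the-envelope estimate. The payload count below is an illustrative assumption, not Arachni's actual figure:

```ruby
# Rough cost estimate of bulk mutation generation for a huge form.
# The payload count is a hypothetical stand-in, not Arachni's real number.
inputs   = 2_000  # inputs on the form
payloads = 100    # assumed payloads tried per input

# Even auditing one input at a time, each (input, payload) pair produces
# a full copy of the form, and every copy carries all 2_000 inputs.
mutations       = inputs * payloads   # form copies generated in bulk
key_value_pairs = mutations * inputs  # total input pairs held in memory

puts "#{mutations} form copies, #{key_value_pairs} key-value pairs"
```

    Holding hundreds of millions of key-value pairs at once is what exhausts the heap before the audit even starts.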

    What I've done now is update the audit to use paired generate-consume operations, instead of generating mutations in bulk first (which requires a lot of memory and adds latency to the audit) and consuming them later. This way only the absolute minimum of elements remains in memory, and the audit moves along much faster.
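    The generate-consume idea can be sketched in plain Ruby with a lazy enumerator. This is a simplification with made-up method names, not Arachni's actual code:

```ruby
# Eager approach: materialize every mutation before auditing -- memory
# grows with the total mutation count.
def mutations_eager(inputs, payloads)
  inputs.flat_map { |name| payloads.map { |p| { name => p } } }
end

# Lazy approach: yield one mutation at a time; only the mutation
# currently being audited needs to live in memory.
def mutations_lazy(inputs, payloads)
  inputs.lazy.flat_map { |name| payloads.map { |p| { name => p } } }
end

inputs   = (1..2_000).map { |i| "input_#{i}" }
payloads = ['<xss>', "' OR 1=1"]

# The eager version builds all 4_000 mutation hashes up front...
eager = mutations_eager(inputs, payloads)

# ...while the lazy one generates them only as the consumer asks.
first_two = mutations_lazy(inputs, payloads).first(2)
```

    The consumer drives generation, so peak memory no longer depends on the total number of mutations.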

    However, because the form is so big and a few mutations of it still have to remain in memory, the system will still require a significant amount of memory. Not as much as before, mind you, but still plenty (half a GB, possibly... hopefully).

    Also, I've added a new option (--http-queue-size) which lets you control the maximum number of requests stored in the HTTP queue before performing a run. More requests mean better I/O scheduling and thus better performance; fewer requests mean less memory (because each element remains in memory until its associated response has arrived and been processed).
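    The trade-off can be illustrated with Ruby's built-in SizedQueue, where a full queue blocks the producer. This is a generic producer-consumer sketch, not Arachni's internals:

```ruby
QUEUE_CAP = 5  # stand-in for --http-queue-size
queue     = SizedQueue.new(QUEUE_CAP)

# Producer: queues "requests". SizedQueue#push blocks once the cap is
# reached, so at most QUEUE_CAP requests sit in memory at any moment.
producer = Thread.new do
  20.times { |i| queue << "request-#{i}" }
  queue << :done # sentinel to stop the consumer
end

# Consumer: pops each request and "processes" its response, after which
# the associated element can be freed.
processed = 0
until queue.pop == :done
  processed += 1
end
producer.join

puts "processed #{processed} requests, never holding more than #{QUEUE_CAP}"
```

    A smaller cap bounds memory at the cost of stalling the producer more often, which is exactly the speed-versus-RAM dial the option exposes.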

    After the test suite has finished I'll see if I can optimize this further and give you some hints about fine-tuning the above option in order to keep RAM consumption at bay.


  5. Posted by Mike on 08 Nov, 2013 11:38 PM

    Great, thanks for working on this...

    By the way, I started another discussion regarding the AutoLogin plugin that I think got caught in your spam filter... I finally registered for an account, so maybe my posts won't be flagged so often.

    Let me know how it goes

    Thanks again!

  6. Posted by Tasos Laskos (Support Staff) on 09 Nov, 2013 12:20 AM

    No worries.

    The spam filter does seem to have a few issues with you, sorry about that.

  7. Posted by Tasos Laskos (Support Staff) on 10 Nov, 2013 03:34 AM

    Problem sort of solved, although I don't see what else I can do.

    Experimental branch (code can be found in the nightlies):

    • --http-queue-size=20: RAM around 110MB.
    • Default (--http-queue-size=500): RAM around 500MB.

    Master branch: RAM at 910MB after mutation generation, just before the audit starts, and increasing as the audit goes on.

    So you can massively improve RAM consumption by lowering the default value of --http-queue-size, but the audit will still be pretty slow due to the design of the webapp. A form with 2,000 inputs means a lot of HTML for Arachni to analyze and a lot of audit workload, but it probably means a big workload for the server as well.
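    In practice the tuning comes down to a single flag, assuming the usual arachni invocation of options plus target URL (the URL below is a placeholder):

```shell
# Cap the HTTP request queue to trade scan speed for lower memory use.
# Target URL is a placeholder for the site with the huge form.
arachni --http-queue-size=20 http://example.com/big-form
```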

    Overall, this is a bad situation, be prepared for a massive hit in performance.

    I won't close this issue until I hear how it works for you so please do let me know.


  8. Posted by Tasos Laskos (Support Staff) on 11 Nov, 2013 03:55 PM

    By the way, you'll need to disable the sqli_blind_rdiff module, because it genuinely needs to generate the mutations prior to performing its analysis; there's no way around that.


  9. Posted by Mike on 11 Nov, 2013 07:24 PM

    Arachni no longer crashes when I set --http-queue-size to 100. RAM utilization for the Ruby process is somewhere around 1GB with the limit set, as opposed to roughly 3GB without it.

    I agree there is nothing the scanner can do to improve the bad design of the web application, so this fix works for me.


  10. Posted by Tasos Laskos (Support Staff) on 11 Nov, 2013 07:32 PM

    No worries man, glad you got it working.


  11. Tasos Laskos closed this discussion on 11 Nov, 2013 07:32 PM.
