Scan takes too long
Hi,
I am experimenting with Arachni to evaluate whether it meets our requirements. I'm using a 1 GB RAM, 1-core CPU virtual machine running bWAPP. Unfortunately, it takes too long (2-3 days) to scan bWAPP. I followed the optimization guides mentioned in the article titled "Optimizing fast scan", but the result was the same. How can I make it faster? I ran Acunetix and Burp Suite on the same machine against bWAPP, and they scanned super fast (at most 30 minutes) compared to Arachni, so I'm wondering what's going on. Here is the command I used to crawl:
./arachni_multi --instance-spawns=2 http://192.168.217.129/bWAPP/aim.php --checks trainer --audit-links --audit-forms --audit-cookies --report-save-path=crawl_report.afr --platforms=linux,php,apache,mysql --browser-cluster-ignore-images
Support Staff 1 Posted by Tasos Laskos on 19 Feb, 2015 02:51 AM
My best guess is that you're killing the machine. The VM is quite underpowered so it's highly likely that the web server gets stressed and takes too long to respond, and using 3 instances to perform the scan makes things much worse.
Also, Arachni has a much higher request time-out setting than most scanners, so if the web server does take a long time to respond Arachni will wait instead of giving up early.
I'm just guessing though. I've downloaded the bee-box appliance to test this out; I'll let you know what I find.
Cheers
2 Posted by Aggie on 19 Feb, 2015 03:14 AM
Wow, fastest reply I've ever seen :). Thanks, man. I tried with only one instance as well; that was slow too, so I switched to arachni_multi.
Support Staff 3 Posted by Tasos Laskos on 19 Feb, 2015 03:21 AM
Running a scan now with the same settings as the ones you mentioned, getting average performance.
One issue I see is the way bWAPP does navigation: via forms with drop-down inputs that have many available values, where the forms' actions point back to the current URL. In essence, this creates a boatload of needless workload that has to be processed in order to get decent coverage; I don't know if the other scanners are that thorough.
Ah and the server just died, as I initially suspected. So, now rescanning with a much lower request concurrency to avoid stressing it too much, will keep you posted.
Cheers
Support Staff 4 Posted by Tasos Laskos on 19 Feb, 2015 03:49 AM
Also, I just found out that there's a link to the root directory somewhere so Arachni ends up scanning not only bWAPP but the rest of the included applications.
Support Staff 5 Posted by Tasos Laskos on 19 Feb, 2015 05:41 AM
OK, you can retry with these extra options:
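(The option list posted here did not survive in this archive. Based on the description that follows — request concurrency lowered dramatically, scope restricted to bWAPP, and a 5-second timeout instead of the usual 50 — the extra options likely resembled the following sketch, using Arachni v1.x flag names; the exact values are an assumption:)

```shell
# Hypothetical reconstruction of the suggested options.
# - restrict the scan scope to the bWAPP directory only
# - throttle to one HTTP request at a time
# - shorten the request timeout from the default 50s to 5s (value is in ms)
./arachni http://192.168.217.129/bWAPP/aim.php \
    --checks=trainer \
    --scope-include-pattern=bWAPP \
    --http-request-concurrency=1 \
    --http-request-timeout=5000
```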
That'll prevent the server from being stressed by dramatically lowering the request concurrency, it'll force the scope to just bWAPP, and in case the server does get stressed Arachni will only wait 5 seconds for each request instead of the usual 50.
The scan finished in 22 minutes with the above configuration.
Cheers
6 Posted by Aggie on 19 Feb, 2015 11:48 AM
Thanks, will try with this and post the result.
Support Staff 7 Posted by Tasos Laskos on 19 Feb, 2015 04:49 PM
No problem.
A full scan with all checks still completely killed the server eventually though, so you'd better increase the VM's resources; 1 request at a time is as slow as Arachni can go.
Cheers
Tasos Laskos closed this discussion on 25 Feb, 2015 04:29 AM.