Limiting arachni scan to 1 URL
Hi, I've been successfully scanning our site with arachni, testing for vulnerabilities, fixing them, and rescanning. It's working, but a full scan takes quite a while. Is there a way I can focus the scan on a single page without crawling the whole site? That way I can fix that script and move on to the next. I have tried an include with just the URL of the page I wanted to scan, but arachni still seems to crawl the whole site. If more information is needed please let me know and I will do my best to answer any questions. Thanks for looking.
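In case it helps, the include attempt looked something along these lines (the URL and pattern here are just placeholders, not our actual site):
./arachni --include=page_to_test.php http://example.com/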
Support Staff 1 Posted by Tasos Laskos on 25 Sep, 2012 03:24 PM
Have you tried using --link-count=1?
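For instance, something like this (the URL is a placeholder for the page you want to audit):
./arachni --link-count=1 http://example.com/page_to_test.php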
2 Posted by Lee Michels on 25 Sep, 2012 03:34 PM
I must apologize; I found an open discussion in the Google group dealing with the same thing right after I sent my question. It outlined --link-count=1 and also covered limiting the modules with -m xss_*, so I modified that for what I was trying to do. This is the command I used to launch arachni:
./arachni -v -g --link-count=1 -m sqli_* --report=html:outfile=2012-09-25_getAddress_001.html
It appears to work, as the scan comes back much faster, but it does not create the report. I am using a variation of the command I have been using to scan the whole site, so I know the report part works. I am wondering if the -m sqli_* right before the report parameter is causing an issue?
As always, thank you for the great tool and for taking the time to respond!
Support Staff 3 Posted by Tasos Laskos on 25 Sep, 2012 03:40 PM
I changed the wildcards a bit since then because matching was flawed; try sqli* instead.
Also, what do you mean it doesn't generate a report? Nothing at all, or does it just not include any vulns?
If it doesn't create anything at all then there's a serious problem somewhere; if the report simply doesn't include any logged issues then it's because of the wildcard matching change I mentioned.
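With that change your earlier command would look something like this (quoting the wildcard so the shell doesn't expand it; the URL and output file are placeholders):
./arachni -v -g --link-count=1 -m "sqli*" --report=html:outfile=report.html http://example.com/page_to_test.php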
Support Staff 4 Posted by Tasos Laskos on 25 Sep, 2012 03:49 PM
Also, since we're on (or near) the subject, you can use the rescan plug-in to save yourself the trouble of crawling every time. You just pass it the first AFR report (which resulted from a full crawl) and it'll use its sitemap instead of crawling.
It doesn't strictly apply to your question, but I'm guessing that you'll want to perform a last full sweep at the end of your bugfixing, so it could be useful then. ;)
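As a rough sketch, and assuming the plugin takes the path to the old report as an afr option (check the plugin listing for the exact option name; the filenames and URL here are placeholders):
./arachni --plugin=rescan:afr=full_scan.afr --report=html:outfile=rescan.html http://example.com/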
5 Posted by Lee Michels on 25 Sep, 2012 03:59 PM
Looks like my last response didn't make it. There is no report at all; there is an .afr file but no HTML report.
I will try the sqli* change and see what happens. I had actually tried to modify that portion on my own, and am now in the middle of what appears to be a full scan, so it will be a bit before I can try.
Thanks for the tip on using rescan, that will save a TON of time!
Support Staff 6 Posted by Tasos Laskos on 25 Sep, 2012 04:02 PM
Which version are you using? Could you try using the nightlies?
They're stable now and will be released as v0.4 in a few days.
7 Posted by Lee Michels on 25 Sep, 2012 04:07 PM
I am using v0.4.1dev, but I think I downloaded it in early September, so I will grab a new one and give that a go.
8 Posted by Lee Michels on 26 Sep, 2012 01:33 PM
Ok, I grabbed a new nightly and rearranged the command I use to launch the scan, and it looks like it is working on just one URL now and the report is being generated. The report is blank, but I am hoping that is because I have taken care of the vulnerability :)
Thank you so much for your help and your great software!
Support Staff 9 Posted by Tasos Laskos on 26 Sep, 2012 01:43 PM
Good, that's good. :)
Let me know if you need further help.
Tasos Laskos closed this discussion on 26 Sep, 2012 01:43 PM.