best use of scanner via cli
hi,
i found it very inconvenient to write --scope or --audit or anything else every time while defining a profile on the cli. what would be an easy method where we can mention the options once and keep telling the scanner what to do?
for example, if i have to scan subdomains, set a page limit and a directory limit, and similarly more options.
Support Staff 1 Posted by Tasos Laskos on 10 May, 2016 06:03 AM
Hello,
You can use the profile options to store and load your configuration.
Cheers
Tasos Laskos closed this discussion on 10 May, 2016 06:03 AM.
naveesharma re-opened this discussion on 10 May, 2016 06:15 AM
2 Posted by naveesharma on 10 May, 2016 06:15 AM
does that mean we can create a profile in the web interface, download it, and use the same on the cli?
but if that is not how it works, can you please explain how we can use it?
Support Staff 3 Posted by Tasos Laskos on 10 May, 2016 06:19 AM
No, you run a CLI scan with the options you want and specify a location where to store the current configuration with --profile-save-filepath.
Then, instead of having to specify all these options again for subsequent scans, you just need to load that profile with --profile-load-filepath.
The URL will not be stored, btw.
Cheers
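A minimal sketch of that workflow; the target URLs, scope values, and profile filename below are placeholders, not values from this discussion:

```shell
# First scan: pass the options you want and save them to a profile file.
# (http://example.com and the limits are placeholder values.)
arachni http://example.com \
  --scope-include-subdomains \
  --scope-page-limit 100 \
  --scope-directory-depth-limit 5 \
  --profile-save-filepath my-profile.afp

# Subsequent scans: load the saved profile; only the URL needs to be
# given again, since the URL is not stored in the profile.
arachni http://another-target.example.com \
  --profile-load-filepath my-profile.afp
```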
Support Staff 4 Posted by Tasos Laskos on 10 May, 2016 06:24 AM
Forgot to mention, you can also do what you mentioned initially: WebUI profiles are indeed exportable for CLI use, but that's not strictly necessary, you can create profiles from the CLI alone too.
5 Posted by naveesharma on 10 May, 2016 06:37 AM
i tried the arachni nightly but it didn't fix the issue of generating a report when aborting from the cli.
i would also like to know about the below mentioned execution. can we make it easier, or shall i define the entire scope like this along with other options as well?
./arachni http://xyz.com --scope-exclude-binaries --scope-auto-redundant 5 --scope-dom-depth-limit 4
Support Staff 6 Posted by Tasos Laskos on 10 May, 2016 06:42 AM
I'll need some more information on the report issue you mentioned, what are you seeing?
About the options, I'm not prepared to rename them, these are the names and I can't change that.
Cheers
Support Staff 7 Posted by Tasos Laskos on 10 May, 2016 06:42 AM
Also, please create an account on this portal, the spam filter keeps blocking your messages.
8 Posted by naveesharma on 10 May, 2016 06:43 AM
typo: i meant aborting from the web interface, not from the cli.
Support Staff 9 Posted by Tasos Laskos on 10 May, 2016 06:45 AM
If this is about the issue in the other discussion we should discuss it there.
Also, can you try doing the same scan from the CLI? That'll provide more feedback and help me understand what's going on.
Cheers
10 Posted by naveesharma on 11 May, 2016 05:24 AM
hi Tasos..
the arachni scanner is very slow. i have specified a limit of only 10 pages and it is taking ages to complete. is there any workaround to make it faster?
Support Staff 11 Posted by Tasos Laskos on 11 May, 2016 05:32 AM
It depends, are the server response times very high or do the web pages make heavy use of JS setTimeout() calls?
This guide is a good start: http://support.arachni-scanner.com/kb/general-use/optimizing-for-fa...
12 Posted by naveesharma on 11 May, 2016 06:30 AM
this is what i am running and it has kept running for 1 hour now; it seems like a never-ending loop. can you suggest what is wrong with the below scan?
Support Staff 13 Posted by Tasos Laskos on 11 May, 2016 06:35 AM
I can't know what's going on just from the configuration, although enabling header audits will result in substantially longer scans; and the increased DOM depth limit will have a similar effect.
If you could send me an e-mail with the real target so that I can perform an identical scan and see what happens then I may be able to help.
Cheers
14 Posted by naveesharma on 11 May, 2016 06:52 AM
tasos,
i aborted the scan, which had been running since yesterday and was only for 100 pages. if you check the attached screenshot it says that the audited pages are 10071. why is that, when i have set the max audit page limit to 100?
Support Staff 15 Posted by Tasos Laskos on 11 May, 2016 07:03 AM
That message is a little misleading, I should update it.
What happens is that one page can have multiple DOM snapshots, so out of the 100 pages 10,000 snapshots were generated.
This large amount is probably due to the increased DOM depth limit you configured.
From the screenshot I can also see that you've significantly reduced the max request concurrency, which will make the scan much slower.
In addition, the system still had to reduce the concurrency even further because the server was responding very slowly (almost 2 seconds) and that's a sign of stress.
Basically, the server is very slow and there were a lot of things that had to be audited, so the scan took a very long time.
16 Posted by naveesharma on 11 May, 2016 07:15 AM
now while generating the report in html it gives me an error: "[NoMethodError] undefined method 'authentication_type=' for #<Arachni::OptionGroups::HTTP:0x000...> Did you mean? authentication_password".
Support Staff 17 Posted by Tasos Laskos on 11 May, 2016 07:16 AM
Can you please show me the entire error?
18 Posted by naveesharma on 11 May, 2016 07:20 AM
check this out.
Support Staff 19 Posted by Tasos Laskos on 11 May, 2016 07:21 AM
You passed an AFR report that was generated with the latest nightlies to an older version, right?
That won't work, you should use the same version that generated the report.
20 Posted by naveesharma on 11 May, 2016 07:25 AM
i used the same version but it was asking me to copy the report to /usr/share/arachni/bin to generate the report. when i copied it there it gave me the same error.
21 Posted by naveesharma on 11 May, 2016 07:51 AM
it worked after copying the bin file to the same location.
Support Staff 22 Posted by Tasos Laskos on 11 May, 2016 07:57 AM
It wouldn't have asked you to copy anything, it may have said that the report file wasn't found at the location you specified.
And copying the executables in that way could result in unexpected behavior.
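For reference, a hedged sketch of the intended report-generation step; the file paths are placeholders and assume you run the executable from the extracted package, without copying anything around:

```shell
# Convert a saved AFR scan file into an HTML report.
# (/path/to/scan.afr is a placeholder; no files need to be moved.)
./bin/arachni_reporter /path/to/scan.afr \
  --reporter=html:outfile=report.html.zip
```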
23 Posted by naveesharma on 11 May, 2016 08:12 AM
then what should i do now? shall i remove the bin from that location and do something else?
because copying the report to that location is not solving the error.
Support Staff 24 Posted by Tasos Laskos on 11 May, 2016 08:44 AM
I'm not sure how your env looks now so, to be safe, remove all Arachni packages and re-extract the nightlies.
Then pass the location of the report to the arachni_reporter executable.
25 Posted by naveesharma on 11 May, 2016 09:57 AM
i am running the scan on one local application and have specified it to audit only 50 pages. also, as per our previous discussion, i have decreased the dom depth with "--scope-dom-depth-limit 2", but it still has not finished the scan. it looks like a never-ending process.
Support Staff 26 Posted by Tasos Laskos on 11 May, 2016 10:01 AM
You may also want to set the --scope-dom-event-limit option, it's new and allows limiting the amount of events to be triggered for each page. It'll help keep the page snapshots low.
However, I can still see that the server is responding very slowly.
A healthy localhost/intranet server would return something like 150 requests per second, you're only getting 11.
27 Posted by naveesharma on 11 May, 2016 10:27 AM
i don't see a --scope-dom-event-limit option.
Support Staff 28 Posted by Tasos Laskos on 11 May, 2016 10:28 AM
It's there, second from the bottom.
29 Posted by naveesharma on 11 May, 2016 10:35 AM
;-) missed it
what should the limit be?
Support Staff 30 Posted by Tasos Laskos on 11 May, 2016 10:36 AM
Since you're worried about performance I'd say something low, like 10; this will result in less coverage though.
The problem is the server: the webapp needs to be audited, but if the server is slow the scan can't complete quickly.
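Putting the thread's advice together, a performance-oriented invocation might look like the sketch below. The target URL and the exact limit values are placeholder assumptions for illustration, not values recommended by the staff in this discussion:

```shell
# Keep DOM exploration cheap: low DOM depth, few events per page,
# skip binary responses, and cap the audited pages.
# (http://example.com and all limit values are placeholders.)
arachni http://example.com \
  --scope-page-limit 50 \
  --scope-dom-depth-limit 2 \
  --scope-dom-event-limit 10 \
  --scope-exclude-binaries
```

Leaving the request concurrency at its default (rather than lowering it, as in the screenshot discussed above) also avoids slowing the scan further.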