Best use of the scanner via CLI

naveesharma

10 May, 2016 05:32 AM

Hi,

I find it very inconvenient to write --scope or --audit or any other option every time I define a profile on the CLI. What would be an easy method where we can specify the configuration once and then just keep telling the scanner what to do?

For example, if I have to scan subdomains and set a page limit and a directory limit, and similarly more options.

  1. Support Staff 1 Posted by Tasos Laskos on 10 May, 2016 06:03 AM

    Hello,

    You can use the profile options to store and load your configuration.

    Cheers

  2. Tasos Laskos closed this discussion on 10 May, 2016 06:03 AM.

  3. naveesharma re-opened this discussion on 10 May, 2016 06:15 AM.

  4. 2 Posted by naveesharma on 10 May, 2016 06:15 AM

    Does that mean we can create a profile in the web interface, then download it and use the same one on the CLI?

    If that's not how it works, can you please explain how we can use it?

  5. Support Staff 3 Posted by Tasos Laskos on 10 May, 2016 06:19 AM

    No, you run a CLI scan with the options you want and specify a location where to store the current configuration with --profile-save-filepath.
    Then, instead of having to specify all these options again for subsequent scans, you just need to load that profile with --profile-load-filepath.
    The URL will not be stored btw.

    Cheers
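
    For illustration, here is a minimal sketch of that workflow; the target URL, audit options, and profile filename are assumptions, not values from this discussion:

        # First scan: specify the options you want and save them to a profile.
        ./arachni http://example.com --audit-forms --scope-page-limit 100 \
            --profile-save-filepath my_profile.afp

        # Subsequent scans: load the saved profile; only the URL has to be repeated.
        ./arachni http://example.com --profile-load-filepath my_profile.afp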

  6. Support Staff 4 Posted by Tasos Laskos on 10 May, 2016 06:24 AM

    Forgot to mention: you can also do what you mentioned initially. WebUI profiles are indeed exportable for CLI use, but that's not strictly necessary; you can create profiles from the CLI alone too.

  7. 5 Posted by naveesharma on 10 May, 2016 06:37 AM

    I tried the Arachni nightly, but it didn't fix the issue of generating a report when the scan is aborted from the CLI.

    I would also like to know about the execution below. Can we make it easier, or shall I define the entire scope like this, along with other options as well?

    ./arachni http://xyz.com --scope-exclude-binaries --scope-auto-redundant 5 --scope-dom-depth-limit 4

  8. Support Staff 6 Posted by Tasos Laskos on 10 May, 2016 06:42 AM

    I'll need some more information on the report issue you mentioned; what are you seeing?
    About the options, I'm not prepared to rename them; these are the names and I can't change that.

    Cheers

  9. Support Staff 7 Posted by Tasos Laskos on 10 May, 2016 06:42 AM

    Also, please create an account on this portal, the spam filter keeps blocking your messages.

  10. 8 Posted by naveesharma on 10 May, 2016 06:43 AM

    Typo: I meant aborting from the web interface, not from the CLI.

  11. Support Staff 9 Posted by Tasos Laskos on 10 May, 2016 06:45 AM

    If this is about the issue in the other discussion we should discuss it there.
    Also, can you try doing the same scan from the CLI? That'll provide more feedback and help me understand what's going on.

    Cheers

  12. 10 Posted by naveesharma on 11 May, 2016 05:24 AM

    Hi Tasos,

    The Arachni scanner is very slow. I have specified only a 10-page limit and it is taking ages to complete. Any workaround to make it faster?

  13. Support Staff 11 Posted by Tasos Laskos on 11 May, 2016 05:32 AM

    It depends: are the server response times very high, or do the web pages make heavy use of JS setTimeout() calls?
    This guide is a good start: http://support.arachni-scanner.com/kb/general-use/optimizing-for-fa...

  14. 12 Posted by naveesharma on 11 May, 2016 06:30 AM

    This is what I am running; it has kept running for an hour and seems like a never-ending loop. Can you suggest what is wrong with the scan below?

  15. Support Staff 13 Posted by Tasos Laskos on 11 May, 2016 06:35 AM

    I can't know what's going on just from the configuration, although enabling header audits will result in substantially longer scans, and the increased DOM depth limit will have a similar effect.

    If you could send me an e-mail with the real target so that I can perform an identical scan and see what happens then I may be able to help.

    Cheers

  16. 14 Posted by naveesharma on 11 May, 2016 06:52 AM

    tasos,

    I aborted the scan, which had been running since yesterday and was only for 100 pages. If you check the attached screenshot, it says that the audited pages are 10071. Why is that, when I have set the max audit page limit to 100?

  17. Support Staff 15 Posted by Tasos Laskos on 11 May, 2016 07:03 AM

    That message is a little misleading, I should update it.

    What happens is that one page can have multiple DOM snapshots, so out of the 100 pages 10,000 snapshots were generated.
    This large amount is probably due to the increased DOM depth limit you configured.

    From the screenshot I can also see that you've significantly reduced the max request concurrency, which will make the scan much slower.
    In addition, the system still had to reduce the concurrency even further because the server was responding very slowly (almost 2 seconds) and that's a sign of stress.

    Basically, the server is very slow and there were a lot of things that had to be audited, so the scan took a very long time.
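
    To make that concrete, here is a hedged sketch of how those two knobs look on the command line; the URL and values are illustrative assumptions, not recommendations for this specific server:

        # Leave the request concurrency at a healthy level instead of throttling
        # it manually, and keep the DOM depth limit low to avoid a snapshot explosion.
        ./arachni http://example.com --http-request-concurrency 20 \
            --scope-dom-depth-limit 2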

  18. 16 Posted by naveesharma on 11 May, 2016 07:15 AM

    Now, while generating the report in HTML, it gives me the error: "[NoMethodError] undefined method 'authentication_type=' for #<Arachni::OptionGroups::HTTP:0x000...> Did you mean? authentication_password".

  19. Support Staff 17 Posted by Tasos Laskos on 11 May, 2016 07:16 AM

    Can you please show me the entire error?

  20. 18 Posted by naveesharma on 11 May, 2016 07:20 AM

    Check this out.

  21. Support Staff 19 Posted by Tasos Laskos on 11 May, 2016 07:21 AM

    You passed an AFR report that was generated with the latest nightlies to an older version, right?
    That won't work, you should use the same version that generated the report.

  22. 20 Posted by naveesharma on 11 May, 2016 07:25 AM

    I used the same version, but it was asking me to copy the report to /usr/share/arachni/bin in order to generate the report. When I copied it there, it gave me the same error.

  23. 21 Posted by naveesharma on 11 May, 2016 07:51 AM

    It worked after copying the bin file to the same location.

  24. Support Staff 22 Posted by Tasos Laskos on 11 May, 2016 07:57 AM

    It wouldn't have asked you to copy anything, it may have said that the report file wasn't found at the location you specified.
    And copying the executables in that way could result in unexpected behavior.

  25. 23 Posted by naveesharma on 11 May, 2016 08:12 AM

    Then what should I do now? Shall I remove bin from that location and do something else?
    Because copying the report to that location is not solving the error.

  26. Support Staff 24 Posted by Tasos Laskos on 11 May, 2016 08:44 AM

    I'm not sure how your env looks now so to be safe remove all Arachni packages and re-extract the nightlies.
    Then pass the location of the report to the arachni_reporter executable.
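
    For illustration, the reporter invocation would look something like this; the file names are hypothetical:

        # Convert a saved AFR report into an HTML report archive.
        ./arachni_reporter scan_report.afr --reporter=html:outfile=scan_report.html.zip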

  27. 25 Posted by naveesharma on 11 May, 2016 09:57 AM

    I am running the scan on a local application and specified auditing only 50 pages. Also, as per our previous discussion, I have decreased the DOM depth with "--scope-dom-depth-limit 2". It still has not finished the scan yet; it looks like a never-ending process.

     Audited 51 pages.
     [~] Audit limited to a max of 50 pages.
    
     [~] Duration: 02:36:19
     [~] Processed 88960/88961 HTTP requests.
     [~] -- 11.349 requests/second.
     [~] Processed 218/219 browser jobs.
     [~] -- 5.124 second/job.
    
     [~] Currently auditing          http://intranet.net/intranet_talk_to_hr.htm
     [~] Burst response time sum     0.0 seconds
     [~] Burst response count        0
     [~] Burst average response time 0.0 seconds
     [~] Burst average               0.0 requests/second
     [~] Timed-out requests          1685
     [~] Original max concurrency    50
     [~] Throttled max concurrency   44
    
     [~] Status: Scanning
    
  28. Support Staff 26 Posted by Tasos Laskos on 11 May, 2016 10:01 AM

    You may also want to set the --scope-dom-event-limit option; it's new and allows limiting the number of events to be triggered for each page. It'll help keep the number of page snapshots low.

    However, I can still see that the server is responding very slowly.
    A healthy localhost/intranet server would return something like 150 requests per second, you're only getting 11.
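
    As a sketch, adding that option to the earlier configuration might look like this; the URL is illustrative and the value 10 is the low limit suggested further down in this thread:

        # Trigger at most 10 DOM events per page to keep snapshot counts down.
        ./arachni http://example.com --scope-page-limit 50 \
            --scope-dom-depth-limit 2 --scope-dom-event-limit 10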

  29. 27 Posted by naveesharma on 11 May, 2016 10:27 AM

    I don't see a --scope-dom-event-limit option.

  30. Support Staff 28 Posted by Tasos Laskos on 11 May, 2016 10:28 AM

    It's there, second from the bottom.

  31. 29 Posted by naveesharma on 11 May, 2016 10:35 AM

    ;-) I missed it.

    What should the limit be?

  32. Support Staff 30 Posted by Tasos Laskos on 11 May, 2016 10:36 AM

    Since you're worried about performance, I'd say something low like 10, though that will result in less coverage.
    The problem is the server: the webapp needs to be audited, but if the server is slow the scan can't complete quickly.
