Arachni for REST-Based Websites

Gaurang Shah

25 May, 2015 04:54 AM

Hi Guys,

I am using Arachni to test our application. The problem I am facing is that Arachni is not able to crawl all the links (URLs) of the website.

My website uses REST services to get data from the server, so even if I visit different pages manually the URL remains the same. I guess this could be the reason why Arachni is not able to crawl all the possible URLs.

If I manually put them in extend-paths, Arachni crawls them. Is there any other way to do this?
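
For reference, a minimal sketch of the manual approach mentioned above, assuming Arachni v1.x and its --scope-extend-paths option; paths.txt is a hypothetical file listing the extra REST paths one per line:

# paths.txt is a hypothetical file, e.g. containing paths such as
#   webui/users
#   webui/reports
arachni https://10.10.170.155/webui/login --scope-extend-paths=paths.txt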

  1. Support Staff 1 Posted by Tasos Laskos on 25 May, 2015 07:47 AM

    I'm afraid I'll need to check this out for myself in order to see what's going on.
    I could make this discussion private and you can send me the site info etc., or you can provide that info via e-mail: [email blocked]

    Cheers

  2. 2 Posted by Gaurang Shah on 25 May, 2015 08:25 AM

    Hi Tasos,

    I think that would be a good idea. Right now I am following the steps mentioned in the following post:
    http://support.arachni-scanner.com/kb/general-use/service-scanning

    So far I have done the following things:

    1. export http_proxy=http://localhost:8282
    2. arachni https://10.10.170.155/webui/login --scope-page-limit=0 --checks=*,-common_*,-backup*,-backdoors,-directory_listing --plugin=proxy --audit-jsons --audit-xmls --plugin=login_script:script=login.rb --session-check-pattern=/.*Gaurang.*/ --session-check-url="https://10.10.170.155/webui/generic/getcurrentuser?_dc=143183311499...;
    3. Visit some pages manually (not sure if this is required)
    4. Stop server: http_proxy=http://localhost:8282 curl http://arachni.proxy/shutdown

    After this step, the command from step 2 hangs, saying the server is being shut down.

    If I use the following command, where I don't mention the login script, it works fine; however, it shows 0 pages audited.
    arachni https://10.219.170.155/webui/login --scope-page-limit=0 --checks=*,-common_*,-backup*,-backdoors,-directory_listing --plugin=proxy --audit-jsons --audit-xmls

  3. Support Staff 3 Posted by Tasos Laskos on 25 May, 2015 12:41 PM

    Is this application live somewhere? I can't know if there's a configuration issue unless I try it.

  4. 4 Posted by Gaurang Shah on 25 May, 2015 12:54 PM

    Hi Tasos,

    No, this application is not live. Let me know what exactly you want me to check and I will check that for you.

    However, would you please let me know if the steps I am performing are correct or not?

  5. Support Staff 5 Posted by Tasos Laskos on 25 May, 2015 01:00 PM

    The idea when using the proxy is to train Arachni via your interactions with the browser.
    Whatever JSON traffic passes through the proxy will be audited once the scan starts, after you shut down the proxy.

    So step 3 is mandatory.

    In your case you don't need to export the http_proxy variable.
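
    A rough sketch of that workflow, using the target URL and proxy port (8282) that appear in this thread; the options shown are examples rather than a definitive recipe:

    # 1. Start the scan with the proxy plugin; Arachni waits while you browse.
    arachni https://10.10.170.155/webui/login \
        --scope-page-limit=0 \
        --audit-jsons --audit-xmls \
        --plugin=proxy

    # 2. Point your browser at the proxy (localhost:8282), log in and visit the
    #    pages whose REST/JSON requests should be audited.

    # 3. Shut down the proxy; the audit of the collected traffic then starts.
    http_proxy=http://localhost:8282 curl http://arachni.proxy/shutdown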

  6. 6 Posted by Gaurang Shah on 25 May, 2015 01:07 PM

    And what about the command? Which command should I use, the one with the login script or the one without it?

    Would you let me know how exactly it works? Should I do something like this:
    1. Start Arachni with the proxy plugin (not sure about the command, with or without the login script)
    2. Set the proxy in the browser and visit pages
    3. Stop the proxy server

    Now what should happen, or what should I do? Should I run the command to scan the website for attacks? If yes, from where will it pick up the REST URLs? Or will the scan start automatically?

  7. Support Staff 7 Posted by Tasos Laskos on 25 May, 2015 01:14 PM

    If the script is giving you problems you can remove it and log in via your browser, but be sure to exclude resources that may log you out.
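
    For example, a minimal sketch of keeping logout resources out of scope, assuming Arachni's --scope-exclude-pattern option and a hypothetical "logout" path on this application:

    arachni https://10.10.170.155/webui/login \
        --scope-page-limit=0 \
        --scope-exclude-pattern=logout \
        --plugin=proxy --audit-jsons --audit-xmls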

    Once you've finished browsing you can shut down the proxy via the panel you'll see at the top of your window, or with:

    http_proxy=http://localhost:8282 curl http://arachni.proxy/shutdown
    

    Once you shut down the proxy the scan will start from the URL you provided at the command line and will include any resources that were made visible via the proxy.

    I'm not sure how to explain it better, I'm sorry.

  8. 8 Posted by Gaurang Shah on 27 May, 2015 08:23 AM

    Hi Tasos,

    I am able to generate the report by following the steps mentioned below, and the report shows the REST URLs as well.

    However, the problem is that I am using Arachni with gauntlt, and I am not able to understand how I would automate this process with gauntlt.

    Does Arachni save all the URLs somewhere? If it does, I can create a new file and provide that as extend-paths.

    Steps:
    1. Run the following command:
    arachni https://10.219.170.155/webui/login --scope-page-limit=0 --checks=*,-common_*,-backup*,-backdoors,-directory_listing --plugin=proxy --audit-jsons --audit-xmls
    2. Set the proxy in the browser and visit URLs.
    3. Stop the proxy server.

  9. Support Staff 9 Posted by Tasos Laskos on 27 May, 2015 01:17 PM

    I don't know how gauntlt works so I can't help you with that.
    Also, it's not the URLs that are important but the JSON traffic. There is something you can use to get the result you want, but I'm afraid I forgot to update that plugin to handle JSON and XML.

    I'll update it and let you know once a nightly is up.

    Cheers

  10. Support Staff 10 Posted by Tasos Laskos on 28 May, 2015 04:05 PM

    Nightlies are up.

    After you've finished training the system via the proxy, you'll be able to grab the input vectors via: http://arachni.proxy/panel/vectors.yml

    Then, instead of using the proxy plugin, you'll be able to load them using the vector_feed one, like so: --plugin=vector_feed:yaml_file=/some/path/vectors.yml

    The rest of the options should be the same as before, but you will need to set a valid session by either using the login_script plugin or providing a --http-cookie-jar.
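
    Put together, a rough sketch of those two steps; the vectors.yml location and the cookie-jar file are placeholders:

    # While the proxy plugin is still running, save the collected input vectors.
    http_proxy=http://localhost:8282 curl http://arachni.proxy/panel/vectors.yml > /some/path/vectors.yml

    # Re-run the scan, feeding the saved vectors instead of using the proxy plugin;
    # the session is kept valid here with a cookie jar (the login_script plugin would also work).
    arachni https://10.10.170.155/webui/login \
        --scope-page-limit=0 \
        --audit-jsons --audit-xmls \
        --plugin=vector_feed:yaml_file=/some/path/vectors.yml \
        --http-cookie-jar=/some/path/cookies.txt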

  11. 11 Posted by Gaurang Shah on 01 Jun, 2015 06:55 AM

    Hi Tasos,

    Thanks for generating the nightly build. I have installed it and am now ready to test my web services. However, I am still unclear about the steps.

    I am not able to understand how to generate vectors.yml. Do I need to create it manually, and if yes, then how? Or is there a way the proxy plugin creates vectors.yml?

  12. Support Staff 12 Posted by Tasos Laskos on 01 Jun, 2015 11:45 AM

    I get the feeling that you're not paying attention to what I'm saying; it's the second thing I mentioned in my last post.

  13. Tasos Laskos closed this discussion on 30 Sep, 2015 02:57 PM.

