How to configure Arachni to scan pages whose navigation is based on #/Tab paths
Hey!
I was trying to scan a few pages whose menu is based on URLs with a # fragment:
<ul class="nav navbar-nav">
<li class="dropdown">
<a href="#" class="dropdown-toggle" role="button" aria-expanded="false">Manage <span class="caret"></span></a>
<ul class="dropdown-menu" role="menu">
<li>
<a href="#Tab1">{{ 'MENU_TAB1' | translate }}</a>
</li>
<li>
<a href="#Tab2">{{ 'MENU_TAB2' | translate }}</a>
</li>
<li>
<a href="#Tab3">{{ 'MENU_TAB3' | translate }}</a>
</li>
<li>
<a href="#Tab4">{{ 'MENU_TAB4' | translate }}</a>
</li>
<li>
<a ng-click="logout()" href="#">{{ 'MENU_LOGOUT' | translate }}</a>
</li>
</ul>
</li>
</ul>
<ul class="nav navbar-nav navbar-right">
<li>
<a ng-click="logout()" href="#">{{ 'MENU_LOGOUT' | translate }}</a>
</li>
</ul>
https://example.com/#/Login
https://example.com/#/Tab1
https://example.com/#/Tab2
https://example.com/#/Tab3
I was able to pass the first step, logging in to the site via the URL https://example.com/#/Login (I'm using a Ruby script to handle the login form), but Arachni wasn't able to navigate to the mentioned pages with # in the URL. I tried replacing that character with %23, but that wasn't recognized properly either.
I even tried storing the URLs in an external file loaded with:
--scope-extend-paths "./urls2.txt"
I also enabled the option:
--scope-include-subdomains
Example command:
./bin/arachni https://example.com/ \
    --report-save-path=report.afr \
    --plugin=login_script:script=login.rb \
    --plugin=metrics \
    --scope-exclude-pattern="logout" \
    --http-request-concurrency=3 \
    --http-request-redirect-limit=3 \
    --browser-cluster-pool-size=3 \
    --scope-dom-depth-limit=3 \
    --scope-auto-redundant=10 \
    --browser-cluster-job-timeout=20 \
    --timeout=04:00:00 \
    --scope-extend-paths "./urls2.txt" \
    --scope-include-subdomains
Contents of urls2.txt:
https://example.com/#/Tab1
https://example.com/%23/Tab1
https://example.com/#Tab1
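For what it's worth, here is what a server actually receives for each of those three entries, sketched with Ruby's standard URI library (example.com stands in for the real host): the fragment is purely client-side and never sent in the request, so the first and third entries request the same resource as the bare site root, while percent-encoding the # turns it into a literal path segment the server presumably doesn't serve.

```ruby
require 'uri'

# What the server sees for each urls2.txt entry: the fragment is
# client-side only, so only the path portion is requested.
%w[
  https://example.com/#/Tab1
  https://example.com/%23/Tab1
  https://example.com/#Tab1
].each do |url|
  u = URI.parse(url)
  puts "#{url} -> path=#{u.path.inspect} fragment=#{u.fragment.inspect}"
end
# Prints:
#   https://example.com/#/Tab1 -> path="/" fragment="/Tab1"
#   https://example.com/%23/Tab1 -> path="/%23/Tab1" fragment=nil
#   https://example.com/#Tab1 -> path="/" fragment="Tab1"
```

This would explain why listing the fragment variants in urls2.txt doesn't extend the crawl: to an HTTP-level crawler they are all either the site root or a non-existent /%23/ path.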
I wonder how I can scan a page with dynamic navigation. Using view-source:https://example.com/#/Login I can see only the navigation buttons there, because all the other objects are, I think, loaded dynamically.
Could you try to help me with it? When I use Arachni on a page built dynamically around # characters, the crawler is not able to find any pages other than those listed above. In the HTML report generated from the AFR file I can't find any URLs with #/ paths.
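One possible workaround, sketched below under the assumption that the fragment routes can be harvested from the rendered navigation markup: extract the #Tab hrefs and build full URLs from them to feed to the scanner. The HTML literal here mirrors the menu snippet above; on a real SPA the page would first need to be rendered by a browser, since the links only exist in the generated DOM.

```ruby
# Hypothetical helper: collect fragment routes from navigation HTML.
# The snippet mirrors the dropdown menu markup quoted earlier.
html = <<~HTML
  <a href="#Tab1">Tab 1</a>
  <a href="#Tab2">Tab 2</a>
  <a href="#" class="dropdown-toggle">Manage</a>
HTML

base = 'https://example.com/'
# 'href="#"' (empty fragment) is skipped because the pattern
# requires at least one character after the '#'.
routes = html.scan(/href="#([^"]+)"/).flatten.map { |f| "#{base}##{f}" }
# routes == ["https://example.com/#Tab1", "https://example.com/#Tab2"]
```

A list built this way could then be passed to the scanner, though the fragments will still only matter to a browser-based crawl, not to plain HTTP requests.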
Kind Regards,
Łukasz
Support Staff 31 Posted by Tasos Laskos on 24 Oct, 2016 07:53 AM
Can you check the nightlies and see if the issue persists?
32 Posted by Aladdin on 24 Oct, 2016 07:54 AM
Hey Tasos,
Yes, I'm going to check the latest version (already downloaded) later today.
I will let you know the results later today or tomorrow.
Thanks in advance!
33 Posted by Aladdin on 25 Oct, 2016 05:16 AM
Processing stopped. Last logs:
I will try once again today.
Btw. I tried pressing 'Enter' to show the menu from which I can generate a report, but it doesn't work for me in Terminal on Mac OS El Capitan.
I used Ctrl+C and the message below showed up in the terminal:
When I pressed Ctrl+C once more, I was able to see the listed issues and the info about the saved AFR file.
34 Posted by Aladdin on 25 Oct, 2016 08:03 AM
One clue: I spotted the same pending jobs when scanning the application with the CSRF checks enabled, using the latest Arachni and the command below:
Found 1 CSRF but stopped processing on:
I will try to scan the application without the CSRF check, using either:
--checks=*,-*csrf* or
--checks=*,-csrf
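Assuming the check filters behave like shell globs (my reading, not something confirmed in this thread), the difference between the two forms can be illustrated with Ruby's File.fnmatch: `*csrf*` excludes any check whose name contains "csrf", while `csrf` excludes only the check with that exact name. The check names below are illustrative, not Arachni's actual check list.

```ruby
# Sketch of glob-style filter matching, assuming shell-like semantics
# (File.fnmatch). Check names here are hypothetical examples.
checks = %w[csrf xss csrf_token_check sql_injection]

contains = checks.select { |c| File.fnmatch('*csrf*', c) }
exact    = checks.select { |c| File.fnmatch('csrf',   c) }
# contains == ["csrf", "csrf_token_check"], exact == ["csrf"]
```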
Support Staff 35 Posted by Tasos Laskos on 25 Oct, 2016 08:26 AM
I don't think it had to do with the checks; this must be caused by the crawl.
Any chance I can get access to the webapp?
36 Posted by Aladdin on 27 Oct, 2016 08:54 AM
Hey Tasos!
I checked it a few times, but unfortunately it's not possible to get access to the application, and that really doesn't depend on me.
Also, the crawler hasn't yet been able to find all the pages; hopefully by the end of the week I will be able to test Arachni further with regard to the timeouts.
The good news is that I will soon share with you the part of the JS script responsible for displaying the menu on the page. I will let you know.
Support Staff 37 Posted by Tasos Laskos on 07 Nov, 2016 06:03 PM
Discussion moved to e-mail.
Tasos Laskos closed this discussion on 28 Nov, 2016 03:24 PM.