tag:support.arachni-scanner.com,2012-07-01:/discussions/suggestions/13-array-of-proxys-array-of-user-agent-in-argumentArachni: Discussion 2012-11-06T16:40:11Ztag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-10-05T14:56:53Z2012-10-05T14:56:53ZArray of proxys, array of user agent in argument<div><p>I've got good news and bad news...</p>
<p>Bad news:</p>
<ul>
<li>I won't support that functionality.</li>
</ul>
<p>Good news:</p>
<ul>
<li>The global HTTP options actually take effect per request.</li>
<li>You can intercept and modify ALL queued HTTP requests, so you
can alter the user-agent and proxy options for any request, even
ones that other components perform.</li>
</ul>
<p>So basically you can do what you want from your plugin using
something like:</p>
<pre>
<code>http.add_on_queue do |request, _|
  request.proxy = 'my.proxy.com:8080'
  request.headers['User-Agent'] = 'MyUA/0.1'
end

http.get 'http://test.com' do |response|
  p response.request.proxy
  #=> "my.proxy.com:8080"

  p response.request.headers['User-Agent']
  #=> "MyUA/0.1"
end

http.run</code>
</pre>
<p>Sound good?</p></div>Tasos Laskostag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-10-05T14:58:59Z2012-10-05T14:58:59ZArray of proxys, array of user agent in argument<div><p>Forgot to mention that this won't affect the session, as cookies
will remain intact -- unless the web devs decided to depend on the
IP address or the User-Agent for session maintenance for some
reason.</p></div>Tasos Laskostag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-10-05T15:06:09Z2012-10-05T15:06:10ZArray of proxys, array of user agent in argument<div><p>That's good news :)</p></div>Beunwatag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-11-05T13:23:08Z2012-11-05T13:23:08ZArray of proxys, array of user agent in argument<div><p>This worked fine a month ago, but now it seems like
<code>http.add_on_queue</code> is never called :( .<br>
I'm using the distributed-crawl branch; maybe that's a clue.<br>
I call arachni this way:<br>
<code>bundle exec arachni http://mysite.tld --module=- -g
--plugin=submarine</code></p>
<p>This is my plugin :</p>
<p><code>class Arachni::Plugins::Submarine < Arachni::Plugin::Base</code></p>
<pre>
<code>@uas = []
@proxys = []

def run
  @uas    = open('uas.txt').read.split
  @proxys = open('mpp.txt').read.split

  # check the proxy validity
  @proxys.each do |pro|
    pr = pro.split(':')
    proxy_class = Net::HTTP::Proxy(pr[0], pr[1], pr[2], pr[3])
    proxy_class.start('http://www.google.com') {|http|
      response = http.head('/index.html')
      @proxys.delete(pro) if response.code.to_i != 200
    }
  end

  http.add_on_queue do |request, _|
    pr = @proxys.sample
    request.proxy = pr
    request.headers['User-Agent'] = @uas.sample
  end

  http.run
end

def self.info
  {
    name:        'Submarine',
    description: %q{Intends to stay under the radar by randomly changing the UA and IP; warning: this plugin must be loaded first.},
    author:      'Benoit Chevillot <benoit@chevillot.org>',
    version:     '0.1'
  }
end</code>
</pre>
<p><code>end</code></p></div>beunwatag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-11-05T13:37:56Z2012-11-05T13:37:56ZArray of proxys, array of user agent in argument<div><p>Nice catch, will sort it out today.</p></div>Tasos Laskostag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-11-05T16:04:05Z2012-11-05T16:04:05ZArray of proxys, array of user agent in argument<div><p>First of all, your plugin had some issues; here's a cleaned-up
version:</p>
<pre>
<code>class Arachni::Plugins::Submarine < Arachni::Plugin::Base

  # Run it on all instances.
  is_distributable

  def prepare
    # Pause the framework while we set up our hooks.
    framework.pause

    @uas     = open('uas.txt').read.split
    @proxies = open('mpp.txt').read.split

    # Check the proxy validity.
    @proxies.dup.each do |pro|
      begin
        Net::HTTP::Proxy( *pro.split( ':' ) ).
          start( 'http://www.google.com' ) do |http|
          next if http.head('/index.html').code.to_i == 200
          raise "Proxy #{pro} seems dead."
        end
      # If we got here, either the proxy server is dead or something
      # more horrible has happened; bottom line is, the proxy needs
      # to be removed.
      rescue
        @proxies.delete( pro )
      end
    end
  end

  def run
    http.add_on_queue do |request, _|
      request.proxy = @proxies.sample
      request.headers['User-Agent'] = @uas.sample
    end
  end

  def cleanup
    framework.resume
  end

  def self.info
    {
      name:        'Submarine',
      description: %q{Intends to stay under the radar by randomly changing the UA and IP; warning: this plugin must be loaded first.},
      author:      'Benoit Chevillot <[email blocked]>',
      version:     '0.1',
      priority:    0 # run first
    }
  end

end</code>
</pre>
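<p>A side note on the list files, since their format is implicit: <code>open('mpp.txt').read.split</code> treats every whitespace-separated token as one proxy entry, and <code>Net::HTTP::Proxy( *pro.split( ':' ) )</code> then splats the colon-separated fields into host, port and optional credentials. The entry layout below is inferred from that code, not a documented format:</p>
<pre>
<code># Inferred layout of one mpp.txt entry (an assumption based on the
# pro.split(':') call above, not a documented format):
#   host:port            -- e.g. my.proxy.com:8080
#   host:port:user:pass  -- for authenticated proxies
entry = 'my.proxy.com:8080:alice:s3cret'
host, port, user, pass = entry.split(':')
# host => "my.proxy.com", port => "8080", user => "alice", pass => "s3cret"</code>
</pre>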
<p>Secondly, did you expect to run a distributed crawl using the
command you mentioned?<br>
Because that won't happen; the <code>arachni</code> CLI will just
run a simple, single-node direct scan.<br>
To perform a distributed scan, use this code example: <a href=
"https://github.com/Arachni/arachni/issues/207#issuecomment-10066220">
https://github.com/Arachni/arachni/issues/207#issuecomment-10066220</a></p>
<p>Just to be clear, I'm not sure how it'll work with a distributed
scan because I haven't gotten to that point yet.<br>
And to be even clearer, I didn't actually test the code I pasted
either, but it looks about right.</p></div>Tasos Laskostag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-11-05T16:44:35Z2012-11-05T16:44:38ZArray of proxys, array of user agent in argument<div><p>Nice refactoring, you're my master!</p>
<p>Two details:<br>
<code>framework.resume</code> is needed at the end of prepare,
otherwise the crawl never starts.</p>
<p><code>start( 'www.google.com' )</code> instead of <code>start(
'http://www.google.com' )</code>, otherwise it gives a 400 error
(dunno why, didn't take the time to check).</p>
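<p>For what it's worth, that 400 fits how <code>Net::HTTP</code> works: <code>start</code> (and <code>Net::HTTP::Proxy#start</code>) takes a bare host name, not a URL, so the whole <code>'http://...'</code> string gets treated as the host and, when going through a proxy, sent along in the request, which a proxy may well reject with 400 (though I haven't verified the exact failure mode). A minimal sketch of stripping the scheme first, using only the Ruby standard library:</p>
<pre>
<code>require 'uri'

# Net::HTTP.start expects a host, not a URL, so parse the URL
# and hand over only its host component.
url  = 'http://www.google.com'
host = URI.parse(url).host
# host => "www.google.com"</code>
</pre>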
<p>For the moment I just use it like before, with a single
node, and with the two adjustments above it works like a charm!</p>
<p>Feel free to distribute this plugin if you want; it can be very
useful (it is for me).</p></div>Beunwatag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-11-05T16:58:15Z2012-11-05T16:58:15ZArray of proxys, array of user agent in argument<div><p>Lol my bad, that method should have been called
<code>clean_up</code>, so do put <code>framework.resume</code> there
to be safe; otherwise you might miss some requests.</p>
<p>As for these sorts of plugins working properly with the new
crawler, I'll take care of that tomorrow.</p>
<p>However, I don't like to encourage people to use proxies with
Arachni, so I won't be including it in the repo.</p></div>Tasos Laskostag:support.arachni-scanner.com,2012-07-01:Comment/193412122012-11-06T16:40:11Z2012-11-06T16:40:11ZArray of proxys, array of user agent in argument<div><p>OK, fixed the problem: plugins are now distributed properly, even
during the crawl phase.<br>
Of course, the code is still considered unstable, so if you come
across issues please let me know.</p></div>Tasos Laskos