Arachni vs other products?
I use the commercial version of Burp every day. I'm not satisfied with that tool, even though it's one of the best on the market, and I'm still looking for something better. Recently I've come across w3af, but right now I don't have enough time to test Arachni, w3af and many other tools myself, so maybe you could help me out here.
Could you briefly explain in what ways Arachni is better than Burp or w3af? Or is it something completely different? I assume you've used these tools, or other popular scanner+proxy combinations like WebScarab, OWASP ZAP etc., so it should be much easier for you to compare them than it is for me.
I'd like to know the basic differences and similarities before I start diving into Arachni :)
Support Staff 1 Posted by Tasos Laskos on 05 Mar, 2013 07:01 PM
Arachni is much more focused than the projects you mentioned; it's meant to be a fire-and-forget scanner (or scanner service).
Even though there is support for a proxy (via the Proxy plugin), it's a very basic one, just enough to train the scanner or log in to the webapp (and record a login sequence). There are no utilities to help you perform a manual pen-test.
If that's what you want (just a scanner) then I guess Arachni would be high up on your list (especially when not counting expensive commercial products).
If you want help when performing manual pen-tests (which is Burp's focus) then you'll find Arachni useless (at least from a UI perspective, some folks use Arachni as a Ruby lib to help with manual pentests with custom scripts and stuff).
You don't have to dive into Arachni though; you can try the nightly packages. They are self-contained, take no time to set up and contain some fancy new features scheduled for the upcoming release.
Extract the archive, have a look at the README file, fire up the web interface and start a scan (against some demo site like http://testfire.net), and while the scan is running have a look around the interface.
It'll take you about 15 minutes (depending on your bandwidth); if you like what you see, that's cool and you can come back here and ask more questions and/or request features you'd like to see.
But it all comes down to what you want: if it's just a scanner, people seem to find Arachni quite satisfying (especially with the new interface, which you'll find in the nightlies); if you want something that helps you perform a manual pen-test, then you'll have to go back to Burp.
Does the above help?
2 Posted by semyazz on 06 Mar, 2013 07:54 AM
Thx for your answer, it cleared up all my doubts.
I've read some info about Arachni, but I thought it was more focused on being a framework than a one-shot scanner. I know manual tests aren't that simple and I'll still have to do some things by hand, but I'm looking for some kind of framework that can be used in scripts to speed up the whole process.
So if Arachni is a fire-and-forget scanner, could you compare it to commercial scanners (Qualys, Nessus etc.)? Are you guys better, equal, or far behind :)? Have you done any tests?
Support Staff 3 Posted by Tasos Laskos on 06 Mar, 2013 04:16 PM
It's both: you can use the Framework to create your own scanner or just use the available interfaces. Most people either use the interfaces or the RPC API; very few need to go deep down into the core libs.
Think of it as libcurl vs the curl utility: the Framework is libcurl and the interfaces are its drivers, as curl is to libcurl.
Here's an example of using Arachni as a Ruby lib: https://gist.github.com/Zapotek/2960625
And it's gotten easier now; you don't even need to specify an Auditor class for the elements because they default to anonymous auditors, to make working from the console easier.
The homepage has some simpler, higher-level examples.
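Something along these lines gives you the flavour of the library route (the exact method names have shifted between Arachni versions -- e.g. modules/auditstore in the 0.4.x era vs. checks/report later -- so treat this as a rough sketch rather than copy-paste):

```ruby
require 'arachni'

# Minimal "Arachni as a library" sketch -- illustrative only; method names
# vary between versions (see note above).
framework = Arachni::Framework.new

framework.options.url = 'http://testfire.net'   # demo target used in this thread
framework.checks.load :xss                      # only load the XSS checks

framework.run

# Print a short summary of what was logged.
framework.report.issues.each { |issue| puts issue.name }
```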
So even though I'm trying to offer a complete fire-and-forget system to end-users, by keeping the code clean and organized it can easily be used as a library to build custom stuff.
Now, as for comparing it to Nessus, you can't; as far as I know Nessus looks for known vulnerabilities of known web applications (along with system services and devices and whatnot).
Arachni does the exact opposite: it only performs dynamic analysis and doesn't look for known vulns in known vulnerable webapps.
As for Qualys, the big differences are that Arachni doesn't yet support JS/AJAX and that it's more of a self-service sort of thing versus Qualys' cloud fondness.
(If you're a cloud fan or you want a distributed system, you can even set up a high-performance global grid of scanners with Arachni if you so choose, so you're not limited in that respect either -- and you'll be able to control your own data.)
As for the lack of DOM/JS/AJAX support in Arachni: pretty much everyone uses either Selenium (which uses engines from Firefox, Chrome or Opera) or one of these browsers directly in order to provide DOM and JS support to their scanner.
That's a sensible approach, because support for that stuff is no walk in the park and it'd take a lot of effort to put into place.
On the other hand, performance plays a huge part in Arachni and Selenium was built for QA, not fuzzing. Scanners perform hundreds of thousands (or millions under some circumstances) of requests and in these cases even a few milliseconds per request/analysis can add up to make a huge difference.
And Selenium really wasn't built for that sort of thing; imagine your browser trying to open a few hundred pages per second -- the horror!
Obviously, nowadays DOM/JS is a requirement and if you've got to do it then you've got to do it.
But since Arachni is an F/OSS project, I've got the luxury of not caring about markets, competitors and checkbox requirements, so I've decided to take my time and do things right. I'll try to write my own custom, lightweight DOM to include in v0.5 to sidestep the performance penalty (and as a bonus, that'll allow Arachni to look much deeper into how the page is being manipulated).
Of course, there's the possibility of falling flat on my ass, in which case I'll use Selenium too but there's nothing to lose by trying to do my own thing.
As a final note, there are people out there who fire up an Arachni Instance with the Proxy plugin enabled and configure their QA tests (which use Selenium or something similar) to go through Arachni's proxy.
This way your QA tests train Arachni (so you can even disable crawling of your webapp); you kill two birds with one stone and save time.
(You don't seem to be in that boat but I figured I'd mention it just in case.)
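If you ever want to try that route, the wiring amounts to pointing the test suite's browser at the Proxy plugin; here's a rough sketch with the selenium-webdriver gem (the proxy host/port are assumptions -- use whatever address you started the Proxy plugin on):

```ruby
require 'selenium-webdriver'

# Route an existing Selenium-based QA suite through the Arachni Proxy plugin so
# that every page the tests visit also trains the scanner.
# 'localhost:8282' is an assumption -- match it to your Proxy plugin settings.
proxy = Selenium::WebDriver::Proxy.new(
  http: 'localhost:8282',
  ssl:  'localhost:8282'
)

caps   = Selenium::WebDriver::Remote::Capabilities.firefox(proxy: proxy)
driver = Selenium::WebDriver.for(:firefox, desired_capabilities: caps)

driver.get 'http://testfire.net'   # ...then the rest of your QA flow as usual
driver.quit
```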
Man...I'm chatty today, was the above, at all, helpful?
4 Posted by semyazz on 07 Mar, 2013 08:06 PM
Whoa, that's quite a big answer ;) Yeah, it was helpful. I'm not sure I got the part about Nessus and known vulnerabilities right; I suppose you meant host scanning, ports etc. and CVEs. Yeah, it was a bad example, my bad. It's a different kind of scanner.
You didn't mention w3af, which I'm particularly interested in, mainly because it's Python and I don't know Ruby at all. Well, it's always a good time to learn a new language.
So, have you compared Arachni to any other software on the market? Maybe to w3af? How good is it, performance- and accuracy-wise?
Yeah, I'm a cloud/distributed-systems freak -- how did you know that :)? It's great to hear that Arachni is prepared to run in a distributed environment.
I didn't know Qualys supports JS; what do you mean by DOM support?
I'm intrigued by "train Arachni". Have you implemented some sort of machine-learning algorithm?
It all looks impressive. I'll try to find some time next week to test Arachni. So you suggest trying the version from the repo, right? :) I think I'll have more questions after that.
Btw, how many people are working on Arachni? How active is the development?
THX!!!
Support Staff 5 Posted by Tasos Laskos on 07 Mar, 2013 09:07 PM
I'd say that it's better than W3AF but that opinion wouldn't be trustworthy since I'm the guy who wrote Arachni.
Have a look here for scanner comparisons: http://www.sectoolmarket.com/
That benchmark was done with an old version; the one in the repo has been greatly improved since then -- especially when it comes to RFI/LFI cases and false positives in general.
But still, even that old version puts it pretty much above all other F/OSS scanners and in the company of the commercial big boys.
And performance-wise it's one of the best, but without blindly DoS-ing the webserver; it monitors server health and adjusts/throttles itself to maintain a stable connection. Of course, you can still end up with a dead server if it's fragile enough, but there's only so much you can do...
Oh, and if you're a distributed-systems freak you can have some serious fun with Arachni.
The DOM is basically the tree structure which the HTML code represents; it's maintained by the browser and exposed to the JS interpreter.
JS scripts operate on the DOM to modify the page dynamically (client-side), update it with the results of AJAX requests and so on, and create new elements.
By "train via the proxy" I meant that you can teach Arachni about any given website by browsing it via the proxy.
The machine learning algorithms are used for custom 404 detection and for meta-analysis in order to identify possible false positives by cross correlating the logged issues with the webserver's behavior throughout the scan.
Although I don't like to use terms related to A.I., because I'm not very familiar with the field and because the A.I. line keeps freaking moving every time we approach it.
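To give a rough idea of the custom-404 part (a conceptual illustration, not Arachni's actual code): request a few paths that are guaranteed not to exist, fingerprint the responses, and later treat any page that matches those fingerprints as a "not found" page even if the server answered with a 200.

```ruby
require 'net/http'
require 'securerandom'
require 'uri'

# Conceptual custom-404 detection -- NOT Arachni's implementation. We probe a
# few random, surely-nonexistent paths, keep their body sizes as crude
# fingerprints and flag any page whose body looks like those "not found" pages.
def not_found_fingerprints(base_url, samples: 3)
  samples.times.map do
    Net::HTTP.get(URI("#{base_url}/#{SecureRandom.hex(8)}")).size
  end
end

def custom_404?(body, fingerprints, tolerance: 50)
  fingerprints.any? { |size| (body.size - size).abs <= tolerance }
end

fingerprints = not_found_fingerprints('http://testfire.net')
body         = Net::HTTP.get(URI('http://testfire.net/some-page'))

puts custom_404?(body, fingerprints) ? 'looks like a custom 404' : 'looks like real content'
```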
This covers some interesting stuff about meta analysis and how it's used: http://arachni-scanner.com/blog/second-part-of-the-new-web-interfac...
(There were a lot of screenshots in that blog post to make my points clear but I'm having some problems with my Amazon S3 account at the moment so they are currently not displayed.)
I suggest you try the nightlies; they are cutting edge and will give you a better idea of what's going to be released as v0.4.2 soonish.
I'm actually the only developer with 2-3 other people testing it constantly and offering feedback and feature suggestions.
Development has always been quite active.
Tasos Laskos closed this discussion on 01 Jun, 2013 07:49 PM.