RAWR - Rapid Assessment of Web Resources

RAWR is a Python tool designed to make web enumeration easy and efficient by providing pertinent information in usable formats.

It uses Nmap (live or from file), Metasploit, Qualys, Nexpose, or Nessus scan data to target web services for enumeration, then visits each host on each port with an identified web service and gathers as much data as possible.
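Extracting web-service targets from scan data boils down to walking the scan's XML for open ports with an HTTP-like service. The sketch below shows the idea against Nmap's XML output format (host/address, ports/port, state, service elements); it is an illustration, not RAWR's actual parser, and it uses the stdlib `ElementTree` rather than the lxml dependency so it stands alone.

```python
# Sketch: pulling web-service targets out of an Nmap XML file, the kind of
# pre-filtering a tool like RAWR performs before visiting each host:port.
# Element/attribute names follow Nmap's XML output format; the function
# name and service-name set are illustrative.
import xml.etree.ElementTree as ET

WEB_SERVICES = {"http", "https", "http-alt", "http-proxy", "ssl/http"}

def web_targets(nmap_xml: str):
    """Yield (address, port) pairs for open ports running a web service."""
    root = ET.fromstring(nmap_xml)
    for host in root.iter("host"):
        addr_el = host.find("address")
        if addr_el is None:
            continue
        addr = addr_el.get("addr")
        for port in host.iter("port"):
            state = port.find("state")
            service = port.find("service")
            if state is None or state.get("state") != "open":
                continue
            if service is not None and service.get("name") in WEB_SERVICES:
                yield addr, int(port.get("portid"))
```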

What you get:

  • A customizable CSV containing ordered information gathered for each host, with a field for notes, etc.
  • An elegant, searchable, jQuery-driven HTML report that shows screenshots, diagrams, and other information.
  • A CSV Threat Matrix for an easy view of open ports across all provided hosts.
  • A wordlist for each host, compiled from all words found in its responses (including the crawl, if used).
  • Default password suggestions, made by checking each service's CPE for matches in the DPE Database.
  • A shelve database of all host information. (comparison functionality planned)
  • Parsed metadata from documents and photos, via customizable modules.
  • Support for a proxy (Burp, ZAP, w3af).
  • Screenshots of RDP and non-passworded VNC interfaces.
  • Multiple web calls based on a user-supplied list of user agents.
  • Captured/stored SSL certificates, cookies, and crossdomain.xml.
  • Email or SMS notification when the scan is complete.
  • A customizable crawl of links within the host's domain.
  • A PNG diagram of all pages found during the crawl.
  • A list of links crawled, in tiered format.
  • A list of documents seen for each site.
  • Automation-friendly output (JSON strings).
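The automation-friendly output is newline-delimited JSON, so downstream tooling can consume it one record per line. A minimal sketch of that consumer pattern is below; note the field names ("host", "port") in the test data are hypothetical, so inspect RAWR's actual output for the real keys.

```python
# Sketch: consuming newline-delimited JSON output (as produced with
# --json/--json-min) one record at a time. Field names in any given
# record are whatever the tool emits -- this helper makes no assumptions
# beyond one JSON object per non-blank line.
import json

def parse_json_lines(stream):
    """Parse newline-delimited JSON records, skipping blank lines."""
    records = []
    for line in stream:
        line = line.strip()
        if not line:
            continue
        records.append(json.loads(line))
    return records
```

In practice the `stream` would be the tool's stdout, e.g. the iterable returned by `subprocess.Popen(...).stdout`.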


Dependencies:
  • nmap - at least 6.00 - required for SSL strength assessment
  • graphviz - site layout from crawl (optional)
  • python-requests - tested w/ 1.2.3, requires at least 0.13.3 (2012-07-12)
  • python-lxml - parsing xml & html
  • python-pygraphviz - site layout from crawl (optional)
  • phantomJS - tested with 1.9.1
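Since two of the dependencies carry minimum versions (nmap at least 6.00, python-requests at least 0.13.3), a pre-flight check needs to compare dotted version strings numerically rather than lexicographically. A small sketch of that comparison (helper names are illustrative, and it assumes purely numeric components like "6.00", not suffixed ones like "6.47SVN"):

```python
# Sketch: numeric comparison of dotted version strings for a dependency
# pre-flight check. String comparison is a trap here: "6.00" >= "10.0" is
# True lexicographically but False as a version comparison.
def version_tuple(v: str):
    """Turn '1.2.3' into (1, 2, 3) for numeric comparison."""
    return tuple(int(part) for part in v.split("."))

def meets_minimum(found: str, required: str) -> bool:
    """True when the found version satisfies the required minimum."""
    return version_tuple(found) >= version_tuple(required)
```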


Usage:
./rawr.py [-n <range> (-p <ports> -s <port> -t <timing>)|-f <xml>|-i <list>]
          [-d <dir>] [--sslv] [-aboqrz] [--downgrade] [--json] [--json-min]
          [-e] [--title <title>] [--logo <file>] [--sqlite3] [--spider]
          [-u|-U] [--check-install|--force-install]

  --version          show program's version number and exit

  -h, --help         show this help message and exit

  -a                 Include all open ports in .csv, not just web interfaces.

  -f XMLFILE         NMap|Nessus|Nexpose|Qualys xml or dir from which to pull
                     targets.

  -i NMAP_IL         Target an input list.  [NMap format] [can't be used with
                     -n]

  -n NMAPRNG         Target the specified range or host.  [NMap format]

  -p PORTS           Specify port(s) to scan.   [default is a short list of
                     common web ports]

  -s SOURCEPORT      Specify a source port for the NMap scan.

  -t NMAPSPEED       Set a custom NMap scan timing.   [default is 4]


  --sslv             Assess the SSL security of each target.  [considered
                     intrusive]

Enumeration Options:
    -b               Use Bing to gather external hostnames. (good for shared
                     hosting)

    -o               Make an 'OPTIONS' call to grab the site's available
                     methods.

    -r               Make an additional web call to get "robots.txt"

    --downgrade      Make requests using HTTP 1.0

    --noss           Disable screenshots.

    --spider         Enumerate all urls in target's HTML, create site layout

                     graph.  Will record but not follow links outside of the

                     target's domain.  Creates a map (.png) for that site in

                     the <logfolder>/maps folder.

Output Options:
    -d LOGDIR        Directory in which to create log folder [default is "./"]

    -q, --quiet      Won't show splash screen.

    -z               Compress log folder when finished.

    --sqlite3        Put output into an additional sqlite3 db file.

    --json           stdout will include only JSON strings. Log folders and

                     files are created normally.

    --json-min       The only output of this script will be JSON strings to
                     stdout.

Report Options:
    -e               Exclude default username/password data from output.

    --logo=LOGO      Specify a logo file for the HTML report.

    --title=TITLE    Specify a custom title for the HTML report.

Update Options:
    -u               Check for newer version of IpToCountry.csv and
                     defpass.csv.

    -U               Force update of IpToCountry.csv and defpass.csv.

    --check-install  Check for newer IpToCountry.csv and defpass.csv. Check
                     for presence of NMap and its version. Check for presence
                     of phantomJS, prompting before installing it.

    --force-install  Force update - IpToCountry.csv, defpass.csv, phantomJS.

                     Also check for presence of NMap and its version.

Examples:
     ./rawr.py -n scanme.nmap.org --spider
          Create log folders in current directory [./log_<date>_<time>_rawr/]
          Follow and enumerate links in the target's HTML as long as
          they're in the target's domain.  
          Will create a map of the site in the maps folder.
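The core of the spider behavior described above is classifying each discovered link as in-domain (followable) or external (recorded but never followed). A stdlib-only sketch of that classification is below; the class and function names are illustrative, not RAWR internals.

```python
# Sketch of the --spider link policy: collect every <a href> in a page,
# resolve it against the page URL, then follow only links whose domain
# matches the target's, while still recording external links.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Gather raw href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def classify_links(base_url: str, html: str):
    """Split a page's links into in-domain (followable) and external."""
    parser = LinkCollector()
    parser.feed(html)
    domain = urlparse(base_url).netloc
    in_domain, external = [], []
    for link in parser.links:
        absolute = urljoin(base_url, link)  # resolve relative hrefs
        if urlparse(absolute).netloc == domain:
            in_domain.append(absolute)
        else:
            external.append(absolute)
    return in_domain, external
```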

     ./rawr.py -n www.google.com -p all
          Pull data from web services found on any of the 65535 ports.

     ./rawr.py -f previous_nmap_scan.xml --sslv
          Use targets from a previous nmap scan, assessing the server's
          SSL security state.
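The SSL assessment itself is delegated to nmap (hence the 6.00 minimum in the dependencies), but the gist of grading a server's SSL state is flagging deprecated protocol versions among those it offers. A toy sketch of that idea; the "weak" set is a general rule of thumb, not RAWR's actual scoring.

```python
# Sketch of the idea behind an SSL strength assessment: flag deprecated
# protocol versions offered by a server. The WEAK_PROTOCOLS set is a
# general security rule of thumb, not RAWR's real criteria.
WEAK_PROTOCOLS = {"SSLv2", "SSLv3", "TLSv1.0"}

def weak_protocols_offered(offered):
    """Return the sorted subset of offered protocol versions considered weak."""
    return sorted(set(offered) & WEAK_PROTOCOLS)
```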

     ./rawr.py -d scanfolder -n scanme.nmap.org -p 80,8080 -e
          Pull additional data about the server/site and its SSL cert from
          ports 80 and 8080, excluding default password data.  
          Stores results in ./scanfolder/log_<date>_<time>_rawr/ .

     ./rawr.py -i nmap_inputlist.iL -p fuzzdb -b -z
          Use an input list, checking the fuzzdb 'common web ports'.
          Use Bing to gather external hostnames for each host.
          Compress results into a .tar file.

     ./rawr.py -u
          Update 'Ip to Country' and 'default password' lists from the
          BitBucket repo.
