Recon at scale

I just wanted to share some things I have learned recently about doing recon at scale. Recon can be pretty straightforward with a small number of targets, but once you reach thousands or tens of thousands of targets it becomes important to automate, organize, and parse out the information you need.

Nmap scans at scale

Typically, when you run an Nmap scan, you simply add all of the arguments necessary for the task at hand. However, if you are going to run these scans repeatedly or on a schedule, you may want some form of automation. You may also want multiple templates for different types of scans.

I was able to achieve all of this with the ruby-nmap gem.

An example template can be seen below:
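A minimal sketch of such a template, assuming the ruby-nmap gem's Nmap::Program interface (the scan options, port list, and file naming below are illustrative placeholders rather than a fixed recipe):

  #!/usr/bin/env ruby
  require 'nmap/program'

  # Targets are passed on the command line, e.g. ./tcp_scan.rb 192.168.1.0/24
  targets = ARGV[0]

  # Port numbers live in one variable so the template is easy to copy and adjust
  ports = [21, 22, 25, 80, 110, 443, 445, 3389, 8080]

  # Date- and time-stamped XML output file for scheduled or repeated runs
  outfile = "scan-#{Time.now.strftime('%Y-%m-%d_%H-%M-%S')}.xml"

  Nmap::Program.scan do |nmap|
    nmap.syn_scan       = true     # -sS (needs root privileges)
    nmap.service_scan   = true     # -sV
    nmap.os_fingerprint = true     # -O
    nmap.ports          = ports    # -p
    nmap.xml            = outfile  # -oX
    nmap.targets        = targets
  end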

In the above example I set all of the arguments I want Nmap to run using the ruby-nmap format. Something that is not in the README is a list of all the arguments and their ruby-nmap equivalents; this can be found here and is very useful for setting up scan templates. The next thing I set up was the scan output: XML, with a date and time stamp added to the file name, which is extremely useful for scheduled reports. The list of targets is passed as a command-line argument, and the port numbers are placed in the ports variable. You can easily modify the script to run very specific scans and save each variation as a separate script. These can then be run in an automated fashion, or started by simply launching the script rather than typing out long Nmap commands.

Working with Output

I explored several of Nmap's output formats for working at scale, but XML is the one I found most useful, for the following reasons:

  1. It’s the import format for the Metasploit database
  2. It can be opened up in a web browser and used for reporting purposes
  3. It's a format ruby-nmap can parse very easily

One of the first things I discovered when trying to report on findings is that if you import XML scans into the Metasploit database, it will correlate information for you such as each target's services, ports, and port states. Exporting that data back out as a CSV means it can be easily filtered in any spreadsheet software. See the example below:

Sample Metasploit Output
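For reference, the round trip through the Metasploit database can be sketched roughly like this in msfconsole (the file paths are placeholders):

  msf > db_import /path/to/scan.xml
  msf > hosts
  msf > services -o /path/to/services.csv

db_import pulls the scan into the database, hosts shows what was correlated, and services -o writes the port/service/state data out as a CSV that opens straight in a spreadsheet.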

Working with output that can be filtered is extremely useful when dealing with a large number of targets or when looking for very specific types of targets. I tried a few different ways of grepping out the information I needed, but in the end I found this to be the easiest way to work with my results. It was also the easiest way to let others work with the report.

The ability to open these scans in a web browser is also extremely useful when reporting or presenting results. This can be seen in the example below, which includes useful information such as the arguments used in the scan:

Sample Nmap XML
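The XML that Nmap produces references its nmap.xsl stylesheet, which is what the browser uses to render it. If you need a standalone HTML copy of a report to hand off, one option (assuming xsltproc is installed; the stylesheet path varies by system) is roughly:

  xsltproc -o scan.html /usr/share/nmap/nmap.xsl scan.xml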

The final way I chose to work with output is the parser built into ruby-nmap. With this parser you can, for example, look for just port 80. A simple example of this can be seen below:
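A minimal sketch of that kind of check, assuming the Nmap::XML parser from a recent version of ruby-nmap (the port number and output format are just illustrative):

  require 'nmap/xml'

  # The report to parse is passed as a command-line argument, e.g. ./parse.rb scan.xml
  Nmap::XML.open(ARGV[0]) do |xml|
    xml.each_host do |host|
      host.each_port do |port|
        # Only print hosts that have port 80 open
        next unless port.number == 80 && port.state == :open

        puts "#{host.ip}\t#{port.number}/#{port.protocol}\t#{port.service}"
      end
    end
  end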

With this format you can manipulate the data however you like and pull information out of the XML reports quickly. You can also work with different reports by passing the report in as a command-line argument.

I recently started a GitHub repository for these scanners and parsers, which can be found here.

Conclusion

I found that when working with a large number of hosts, as I am required to do daily, it becomes necessary to have a few new skills handy. These are just a couple of the methods I use to automate scans, report on results, and parse out information. A number of tools that truly stood the test of scale are listed below, and I will add to the list as I use more tools against large numbers of targets and verify that they work without issues.

Tools that work at scale

This is a list of tools that I have used at scale and that passed the test:

  • Nmap (This tool has never failed me)
  • Ruby-nmap (The gem worked with a large number of targets without fail)
  • EyeWitness (Scanned over 50k URLs with output to a report)

 
