The purpose of Dnsenum is to gather as much information as possible about a domain. The program currently performs the following operations:
1) Get the host's address (A record).
2) Get the name servers (threaded).
3) Get the MX records (threaded).
4) Perform AXFR queries on name servers and get BIND versions (threaded).
5) Get extra names and subdomains via Google scraping (Google query = "allinurl: -www site:domain").
6) Brute force subdomains from a file; can also perform recursion on subdomains that have NS records (all threaded).
7) Calculate C class domain network ranges and perform whois queries on them (threaded).
8) Perform reverse lookups on netranges (C class and/or whois netranges) (threaded).
9) Write ip-blocks to the domain_ips.txt file.
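Taken together, most of these operations can be driven from a single invocation. As a minimal sketch (example.com is a placeholder target and dns.txt a placeholder subdomain wordlist):

  ./dnsenum.pl --enum -f dns.txt example.com

The --enum switch is the shortcut described in the options below (equivalent to --threads 5 -s 20 -w), so this run performs the threaded DNS queries, Google scraping, brute forcing, whois lookups and reverse lookups in one pass.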
Let's begin:
- First, open the dnsenum tool from Backtrack >> Information Gathering >> Network Analysis >> DNS Analysis >> dnsenum.
- Basic usage of dnsenum is ./dnsenum.pl example.com
- As you can see in the image, it lists all the available host addresses, name servers and mail (MX) servers, and it also attempts a zone transfer using the listed name servers.
- For Google scraping, type ./dnsenum.pl -p 1 -s 1 example.com, but this option doesn't work for me.
- -p tells dnsenum the number of Google search pages to process when scraping names; the -s switch, which must be specified alongside it, sets the maximum number of subdomains to scrape.
- To brute force subdomains, type ./dnsenum.pl -f dns.txt example.com, where dns.txt is a file of candidate subdomain names.
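A few hedged variations of these commands (the resolver address, page/scrap counts and wordlist name are illustrative placeholders, not values from the original run):

  ./dnsenum.pl --dnsserver 8.8.8.8 example.com      # run the basic A/NS/MX queries against a specific DNS server
  ./dnsenum.pl -p 5 -s 20 example.com               # scrape up to 20 extra names from the first 5 Google result pages
  ./dnsenum.pl -f dns.txt -r example.com            # brute force from dns.txt and recurse into subdomains that have NS records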
Here are all the options available in the dnsenum tool:
Usage: dnsenum.pl [Options] <domain>
[Options]:
Note: the brute force -f switch is obligatory.
GENERAL OPTIONS:
  --dnsserver   Use this DNS server for A, NS and MX queries.
  --enum  Shortcut option equivalent to --threads 5 -s 20 -w.
  -h, --help  Print this help message.
  --noreverse  Skip the reverse lookup operations.
  --private  Show and save private ips at the end of the file domain_ips.txt.
  --subfile  Write all valid subdomains to this file.
  -t, --timeout  The tcp and udp timeout values in seconds (default: 10s).
  --threads  The number of threads that will perform different queries.
  -v, --verbose  Be verbose: show all the progress and all the error messages.
GOOGLE SCRAPING OPTIONS:
  -p, --pages  The number of google search pages to process when scraping names, 
   the default is 20 pages, the -s switch must be specified.
  -s, --scrap  The maximum number of subdomains that will be scraped from Google.
BRUTE FORCE OPTIONS:
  -f, --file  Read subdomains from this file to perform brute force.
  -u, --update  Update the file specified with the -f switch with valid subdomains.
   a (all)  Update using all results.
   g  Update using only google scraping results.
   r  Update using only reverse lookup results.
   z  Update using only zonetransfer results.
  -r, --recursion  Recursion on subdomains, brute force all discovered subdomains that have an NS record.
WHOIS NETRANGE OPTIONS:
  -d, --delay  The maximum value of seconds to wait between whois queries, the value is defined randomly, default: 3s.
  -w, --whois  Perform the whois queries on c class network ranges.
REVERSE LOOKUP OPTIONS:
  -e, --exclude  Exclude PTR records that match the regexp expression from reverse lookup results, useful on invalid hostnames.
OUTPUT OPTIONS:
  -o, --output  Output in XML format. Can be imported in MagicTree (www.gremwell.com).
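As a final hedged example that ties several of the above options together (the resolver, wordlist and output file names are placeholders):

  ./dnsenum.pl --dnsserver 8.8.8.8 -f dns.txt -r --subfile subdomains.txt -o dnsenum.xml example.com

The XML file written by -o can be imported into MagicTree for reporting, while --subfile keeps a plain list of all the valid subdomains that were found.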