Enumerating Windows Domains with PowerShell: DomainEnum v0.1.0

REPO:  https://github.com/pjhartlieb/post-exploitation/tree/master/powershell/DomainEnum

|    p.j.hartlieb
|    powershell post-exploitation
|    DomainEnum module v.0.1.0
|    2015.06.24
|    last verified 2015.06.24

[*] references

## [0] Reference: https://www.veil-framework.com/veil-powerview/
## [1] Reference: http://technet.microsoft.com/en-us/library/ff730967.aspx
## [2] Reference: http://msdn.microsoft.com/en-us/library/system.directoryservices.directorysearcher(v=vs.110).aspx
## [3] Reference: http://msdn.microsoft.com/en-us/library/system.directoryservices.directoryentry(v=vs.110).aspx

[*] background

The DomainEnum module is intended to support post-exploitation activities from within the user context on the target domain. It will enumerate domain computers, servers, users, emails, groups, group membership(s), sites, subnets, and subnets per site, and save the results to one or more files. Whenever possible it will also enumerate computers, servers, users, groups, and group membership per OU. It is intended to establish situational awareness once you drop onto "patient 0" and set you up to make the most of wherever you pivot next.

This module was created and tested with:
 Windows PowerShell 2.0
 Windows 7 Professional SP1

[*] requirements

- n/a

[*] execution
- Create the following directory structure %USERPROFILE%\documents\windowspowershell\modules\DomainEnum
- Load the contents of the 'DomainEnum' directory into the new directory
- Open terminal
- Type
 >powershell -ExecutionPolicy Bypass
 PS>import-module DomainEnum
 PS>get-command -module DomainEnum
- All output will be posted to C:\Users\Public\

[*] functionality

- Get-Homebase        identifies the DC and target domain
- Get-Pedigree        returns baseline information from the target host (patient 0)
- Get-Computer        returns all computers in the current domain
- Get-DC              returns the DCs and PDC for the current domain
- Get-Group           returns all groups in the current domain
- Get-GroupUser       returns all users in each group for the current domain
- Get-Server          returns all servers in the current domain
- Get-User            returns all users in the current domain
- Get-Email           returns all email addresses for all users in the current domain
- Get-OU              returns all OUs in the current domain
- Get-OUUser          returns all users for each OU in the current domain
- Get-OUServer        returns all servers for each OU in the current domain
- Get-OUGroup         returns all groups for each OU in the current domain
- Get-OUComputer      returns all computers for each OU in the current domain
- Get-SiteServer      returns all servers for each site in the current domain
- Get-SiteSubnet      returns all subnets for each site in the current domain
- Get-GroupMember     returns all users in a specific group in the current domain
- Get-HighValueGroup  locates high-value groups, enumerates their users, and harvests email addresses
- Get-DomainDump      returns all data from all functions

[*] thanks
- Lucius for helping to find those unholy syntax errors and figuring out how to get it to execute hassle-free.


yeyo.pl: Quickly harvest user data for a specific organization or keyword


[1] http://www.blackhatlibrary.net/Security101_-_Blackhat_Techniques_-_Hacking_Tutorials_-_Vulnerability_Research_-_Security_Tools:General_disclaimer <---- I borrowed content from here for the disclaimer below.


This script is intended to quickly and easily generate contact information for a specific keyword or organization based on the content returned from www.yatedo.com.  From what I can gather, Yatedo sources its information from publicly available resources on the web and concatenates it all together to present a reasonably accurate user profile.  There is no API, and parsing the HTML seems straightforward.  If you are one of those folks who operates multiple puppet accounts on social media networks (which is a gross violation of the ToS and is not recommended), then this is a good way to whip up some seed accounts to connect with and/or pivot from.  The output is CSV-formatted as:

First Name, Last Name, Organization, Role


This script violates the ToS for www.yatedo.com and may get you banned.  This script is intended for educational purposes only.  I will not be held liable for a third party's use (or mis-use) of this information in any way.  Readers, end-users, and downloaders of content are responsible for their own actions.  Readers, end-users, and downloaders of content agree not to use content provided for illegal actions.  Content is provided as an educational resource for security researchers and penetration testers.


[*] https://github.com/pjhartlieb/recon-and-mapping/blob/master/yeyo.pl

## caveats
- I am not a programmer.  This script is not nearly as tight and clean as it could/should be.
- Improvements and TBD tasking are captured at the top of the script

## usage
> perl yeyo.pl -k <keyword or organization> -s <sleep time between harvesting contact data>

### output

> perl yeyo.pl -k "benchmade" -s 5

[*] Validating keyword/organization ...
[*] Validating sleep time ...
[*] Keyword entered "benchmade".
[*] Sleep times will be between 0 and 5.

[*] Retrieving frontpage for www.yatedo.com
[*] Yatedo appears to be up
[*] Sleeping for 2 seconds to avoid lockout

[*] Submitting search for benchmade
[*] Search successful. Content retrieved
[*] 20 unique links to users and additional results pages were found on the first page
[*] Cumulative results are here: http://www.yatedo.com/s/companyname%3A((benchmade))/normal
[*] Sleeping for 0 seconds to avoid lockout

[*] Retrieving cumulative results for benchmade
[*] Successful. Content retrieved
[*] 16 Links to users found
[*] 1 Links to additional results pages found

[*] Harvesting user data with sleep times between 0 and 5 seconds between records

[*] 11 suitable users found to date

[*] Retrieving target URLs from results page 2
[*] Successful. Content retrieved
[*] 12 Links to users found
[*] 0 Links to additional results pages found

[*] Harvesting user data with sleep times between 0 and 5 seconds between records

[*] 11 suitable users found to date

[*] candidate user list

Chuck,Alf,Benchmade of Buffalo,Owner
Lyudmila,Ezersky,Benchmade Inc., Benchmade...,Human Resources Administrator
Dan,Janovicz,Benchmade Knife Company,Manufacturing Engineer
Vance,Collver,Benchmade Knife...,Product Development Manager, Process Development Technician,...
Enzo,Cardillo,Benchmade Leatherworks Inc.,President
Martin (Marty),Mills,Benchmade Knife Company,Manufacturing Engineer
Dillon,Daniel,Benchmade Knife Company,undef|past-role
Kathryn,Delaplain,Benchmade Knife Company,...,Customer Service, Warranty Repair Manager, Multiple
Joe,Verbanac,Benchmade Knife Company,...,Marketing Manager, Sr Art Director
Zack,Hilbourne,Benchmade Knife Company,...,Design Engineer, Mechanical Designer


I would like to port this over to python and share it with the recon-ng community.  Hopefully, folks will find it useful.


Passive Recon: Collapsing your target's wavefunction.

We recently had the opportunity to speak with the fine folks at the Charleston ISSA.  We had a great time and are thankful for the opportunity.  The abstract is included below.  The link is provided at the bottom.


Title: Passive Recon: Collapsing your target's wavefunction.

An open and accurate accounting of the available intelligence for an individual, organization, or business is typically an undervalued component of both offensive and defensive information security activities. From the defender's perspective, it is important to understand how the source, content, and fidelity of publicly available data can affect the overall security posture of the organization. For the attacker, the gathering and analysis of publicly available data, which often includes usernames, emails, hostnames, subnets, technologies deployed, new product initiatives, employee habits, hobbies, and relationships, will provide actionable intelligence products that can be leveraged to gain a foothold in the target organization and provide the foundation for a successful attack. This presentation will cover intelligence sources, gathering and analysis methods, and the supporting toolset. Individual use cases will highlight how a specific piece of information can be developed into an actionable intelligence product that can then be incorporated into a larger attack plan. This presentation also provides suggestions for limiting, detecting, and mitigating against the information that is made available to the public.

Presentation is here


Generating email addresses from a non-uniform list of usernames


[*] http://ha.ckers.org/fierce/  <------ helped me to figure out how to provide command line options


I've been fortunate enough to be able to do full-time pentesting for about the last year.  For each engagement, a good deal of my time is typically spent doing passive reconnaissance and mapping of the target organization.  The objective for this phase has always been the creation of actionable intelligence products that can support later phases of the engagement and more or less provide the foundation for a successful test.  For whatever reason (most likely inexperience), I've always been more successful building out a target list of individual users/usernames vice individual emails.  The username format is typically all over the map since the resources they're pulled from are scattered far and wide.  I needed a quick way to generate candidate emails in the event that I had incomplete information or was unsure of the final format for the email address.  I created a small perl script to accomplish the task.  Armed with the list, the tester may choose to phish the entire list regardless of whether or not the address exists and play the percentages.  Alternatively, this list may be used together with one or more SMTP enumeration techniques; the end product being a list of clean and verified email addresses.
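The core idea can be sketched in a few lines of Python. This is an illustrative stand-in, not the perl script itself: the format list and the `candidate_emails` helper are my own, and genmail.pl covers more variants.

```python
# Generate candidate email addresses for a "First Last" username and a domain.
# The format list below is illustrative; genmail.pl covers more formats.

def candidate_emails(fullname, domain):
    first, last = fullname.lower().split()
    formats = [
        f"{first}.{last}",        # first.last
        f"{first[0]}{last}",      # flast
        f"{first}{last[0]}",      # firstl
        f"{last}{first[0]}",      # lastf
    ]
    # When the middle initial is unknown, brute-force all 26 possibilities
    formats += [f"{first}.{mi}.{last}" for mi in "abcdefghijklmnopqrstuvwxyz"]
    return sorted({f"{fmt}@{domain}" for fmt in formats})

emails = candidate_emails("Philip Hartlieb", "foo.com")
print(len(emails))  # 30 unique candidates for this format list
```

The total balloons quickly (as the example run below shows, with 160 addresses from a single name), which is why pairing the list with SMTP verification is attractive.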


[*] https://github.com/pjhartlieb/recon-and-mapping/blob/master/genmail.pl

## caveats
- I am not a programmer.  This script is not nearly as tight and clean as it could/should be
- I have not attempted to incorporate a regex for every email format that a tester may come across
- Improvements and TBD tasking are captured at the top of the script

## usage
> perl email_generation_v003.pl -d <target domain> -f <username list>

### output
>  cat test.txt
Philip Hartlieb

>perl email_generation_v003.pl -d foo.com -f test.txt

[*]    File name entered "test.txt"

[*]    Target domain "foo.com"

[*]    File exists.

[*]    Executing.

[*]    Domain appears to be formatted correctly. Proceeding

[*]    The number of candidate usernames in the base array is: 1

[*]    The number of usernames converted to the "first.last" format is: 1

[*]    The number of usernames converted to the "first.mi.last" format is: 26

[*]    The number of usernames converted to the "LastFiMi" format is: 26

[*]    The number of usernames converted to the "FiMiLast" format is: 26

[*]    The number of unique email addresses generated is: 160

[*]    All emails written to "email_enumeration.txt"

[*]    Have a nice day

> cat email_enumeration.txt


The regex and final formatting can be changed as needed.  I've heavily commented the code to make this a bit easier.  


Parsing output with XMLSTARLET


[1] http://lanmaster53.com/ <------ Tim Tomes

[2] http://www.pentesticles.com/2012/05/we-have-port-scans-what-now.html   <------ Original post

[3] http://xmlstar.sourceforge.net/ <------ xmlstarlet project

[6] http://cirt.net/nikto2 <------ Nikto

[9] http://xmlstar.sourceforge.net/doc/UG/ <------ xmlstarlet documentation

[10] http://en.wikipedia.org/wiki/XSLT <------ wikipedia entry for XSLT

[11] http://www.w3schools.com/xpath/xpath_syntax.asp <------ basic XPath reference


An interesting post came across the lanmaster53 [1] twitter feed a while back.  It pointed to a permalink [2] on the pentesticles blog.  The blog presented a really tight way to parse through nmap data and kick out comma-delimited output that could then be fed to any one of a number of tools.  IMHO the special sauce was the following one-liner:

cat port_scans/hot-targets.tcp.services | xmlstarlet sel -T -t -m "//state[@state='open']" -m ../../.. -v address/@addr -m hostnames/hostname -i @name -o '  (' -v @name -o ')' -b -b -b -o "," -m .. -v @portid -o ',' -v @protocol -o "," -m service -v @name -i "@tunnel='ssl'" -o 's' -b -o "," -v @product -o ' ' -v @version -v @extrainfo -b -n -| sed 's_^\([^\t ]*\)\( ([^)]*)\)\?\t\([^\t ]*\)_\1.\3\2_' | sort -n -t.

I put this to use several times with no questions asked.  It worked like a charm and was especially useful when I was just beginning to sink my teeth into a network.  However, given the time, I always like to look under the hood and figure out why something actually works.  If I took the time to understand and appreciate xmlstarlet, I could then add it to my personal arsenal when handling any other bit of XML output for testing and/or reporting purposes.  Other tools that kick out XML include Nikto [6], dnsrecon [7], Nessus [8], etc.  What follows below is an attempt to break that statement down and take a closer look at xmlstarlet [3], which is an awesome tool.

Let me caveat all this by saying that I am late to the game and this is probably ancient history for most.  Hence this post on examining the XML data structure in nmap output.  Regardless ... onward.

A much better description of xmlstarlet can be found on the project page itself [3].  However, I like to keep the following first principles in mind when using the tool.  Xmlstarlet will allow you to: [3]

-  "Browse tree structure of XML documents (in similar way to 'ls' command for directories)"; 
-  "Calculate values of XPath expressions on XML files (such as running sums, etc)"; and 
-  "Apply XSLT stylesheets to XML documents (including EXSLT support, and passing parameters to stylesheets)"

"XSLT (Extensible Stylesheet Language Transformations) is a language for transforming XML documents into other XML documents,[1] or other objects such as HTML for web pages, plain text or into XSL Formatting Objects which can then be converted to PDF, PostScript and PNG."  [10]

So, in short, I am going to use this tool to dig around in an XML document for what I want and then transform it into something more usable (i.e., CSV).
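To make that dig-and-transform idea concrete before getting into the xmlstarlet specifics, here is the same extraction sketched with Python's standard xml.etree on a hand-written, toy nmap-style snippet (not real scanner output). Python walks parent-to-child explicitly rather than matching //state and climbing back up with "-m ../../..", since ElementTree's parent-axis support is limited:

```python
import xml.etree.ElementTree as ET

# Toy nmap-style XML; real nmap output has many more elements and attributes.
doc = """
<nmaprun>
  <host>
    <address addr="127.0.0.1"/>
    <ports>
      <port protocol="tcp" portid="80">
        <state state="open"/>
        <service name="http" product="Apache httpd" version="2.2.14"/>
      </port>
      <port protocol="tcp" portid="21">
        <state state="closed"/>
        <service name="ftp"/>
      </port>
    </ports>
  </host>
</nmaprun>
"""

root = ET.fromstring(doc)
rows = []
for host in root.iter("host"):
    addr = host.find("address").get("addr")
    for port in host.iter("port"):
        # Keep only ports whose <state> child is open, like //state[@state='open']
        if port.find("state").get("state") != "open":
            continue
        svc = port.find("service")
        product = f'{svc.get("product", "")} {svc.get("version", "")}'.strip()
        rows.append(",".join([addr, port.get("portid"),
                              port.get("protocol"), svc.get("name"), product]))

print("\n".join(rows))  # 127.0.0.1,80,tcp,http,Apache httpd 2.2.14
```

The closed ftp port is filtered out, and each open port becomes one CSV row, which is exactly what the xmlstarlet one-liner achieves in a single pipeline.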


## generating the "hot-targets.tcp.services"  file

This was covered in an earlier pentesticles post [4].  The key bit from the larger nmap command, "-oA port_scans/hot-targets.tcp.services", will result in output in all 3 major formats, including XML.  If you want to keep things nice and neat, "-oX" will give you XML only.

To quickly generate some interesting files to play with I used the existing services on BT5-R3 [5].

### start services

Applications > Backtrack > Services > HTTPD > Apache start

Applications > Backtrack > Services > MySQLD> mysql start

### generate nmap data

> nmap -sSV -p80,43,25,3389,23,22,21,53,135,139,445,389,3306,1352,1433,1434,1157,U:53,U:161 -n -vv -oA hot-targets.tcp.services

This will kick out several files in your CWD.  The target file is "hot-targets.tcp.services.xml".  There were only 2 open ports (80 and 3306).  Everything else was closed.

Before breaking down each of the commands above, I'll take a quick look at the structure of the nmap XML output.  The commands below will be helpful when crafting a specific search later on.

## display all xml elements in nmap output.  A subset is shown below.
>   xmlstarlet el hot-targets.tcp.services.xml 


## display all xml elements and attributes.  A subset is shown below.
>   xmlstarlet el -a hot-targets.tcp.services.xml 


## display all xml elements, attributes, and attribute values.  A subset is shown below.
>   xmlstarlet el -v hot-targets.tcp.services.xml 

nmaprun/host/ports/port[@protocol='tcp' and @portid='21']
nmaprun/host/ports/port/state[@state='closed' and @reason='reset' and @reason_ttl='64']
nmaprun/host/ports/port/service[@name='ftp' and @method='table' and @conf='3']

Now we can dig into the pentesticles command structure and view what's being executed at each step.

## chunk 1 (with nesting points listed; these will be addressed a bit later, but are listed here for completeness)
xmlstarlet sel -T -t -m "//state[@state='open']" -m ../../.. -v address/@addr -m hostnames/hostname -i @name -o '  (' -v @name -o ')' -b -b -b -o ","

### -T -t -m "//state[@state='open']"
------> "sel" This is the command line option used to select or query data.

------> "-T"  Specifies that the output will be text.

------> "-t"  Specifies that a template will be used. I am assuming that this makes reference to the XSLT templates mentioned above.  

------> -m "//state[@state='open']"  This means that xmlstarlet is looking to match a specific XPath expression.  It will find every child element "state" whose "state" attribute is set to "open".  This will match all such child elements regardless of the parent, which flows from the "//" XPath syntax [11].  This is the 0th level of nesting.

###   -m ../../.. -v address/@addr
To better understand this next portion of the expression, it helps to understand where the previous portion left off in the XML document.  The first "-m" expression examined the "@state" attribute of the "state" element from the following XPath expression: "nmaprun/host/ports/port/state".  With this in mind, the next step is discussed below.

------> "-m ../../.. "  This seems to be very similar to *nix command "cd ../../..", which brings the user up into successive parent directories.  With this analogy in mind, I'm assuming that this xmlstarlet command will walk up the path from "state" (child) to "port" (parent/child) to "ports" (parent/child) and finally to the "host" (parent/child) element for which the "@state" attribute was = "open".  The next step is to access and output additional attribute data for the parent host whose "@state" attribute was = "open". This is the 1st level of nesting.

------> "-v address/@addr"   This will print the value contained in the "address/@addr" attribute for each host whose "@state" attribute was = "open".

###  -m hostnames/hostname -i @name -o '  (' -v @name -o ')' -b -b -b -o ","
------> "-m hostnames/hostname"  Now, for each host whose "@state" attribute was = "open", this expression matches the XPath expression "hostnames/hostname".  This is the 2nd level of nesting.

------> "-i @name -o '  (' -v @name -o ')'"  When there is a match, the "-i" switch looks to see if a "@name" attribute exists.  If it does, the "-o" option outputs '  (' + @name + ')', e.g. "  (dogballs)" if "dogballs" was the value of the "@name" attribute for the "hostname" element.  This is the 3rd level of nesting.

------> "-b -b -b -o ','"  The command line option "-b" is used to break out of the nesting that occurs for any "-m" or "-i" command line option used after the first "-m".  For the expression we've been examining thus far, there are two additional "-m" and one "-i" used after the first "-m"; therefore, 3 "-b" breaks are required.  It would appear that "-v" does not apply any nesting.  The "-o" option is used to output a literal comma.

Using 3 "breaks" returns the xmlstarlet context back to the following element: "nmaprun/host/ports/port/state".  Again, this takes the xmlstarlet context back to all "state" elements whose "state" attribute is set to "open".  Now move on to the next chunk of the xmlstarlet expression.

## chunk 2 (with nesting points listed)
-m .. -v @portid -o ',' -v @protocol -o "," -m service -v @name -i "@tunnel='ssl'" -o 's' -b -o "," -v @product -o ' ' -v @version -v @extrainfo -b -n -|

### -m .. -v @portid -o ',' -v @protocol -o ","  
------> -m ..   This instructs xmlstarlet to go to the parent element ("nmaprun/host/ports/port") for the current context ("nmaprun/host/ports/port/state").  So now xmlstarlet is examining the attributes for the port element for all "state" elements whose "state" attribute is set to "open".  As the "-m" option was again used, bear in mind that this will require a break "-b" later on (x1). This is the 1st level of nesting.

------> -v @portid -o ',' -v @protocol -o ","  For the parent element now print the "@portid" and "@protocol" attributes for the "port" element and follow each with a literal comma.   

### -m service -v @name  
------> -m service  This instructs xmlstarlet to match the "service" child elements ("nmaprun/host/ports/port/service") for the current context, which has the element "nmaprun/host/ports/port" as the parent.  Again, as the "-m" option was used, this will require a break "-b" later on (x2).  This is the 2nd level of nesting.

------>  -v @name  Print the "@name" attribute for each "service" element matched.

###  -i "@tunnel='ssl'" -o 's' -b -o "," -v @product -o ' ' -v @version -v @extrainfo -b -n
------> -i "@tunnel='ssl'" -o 's' -b -o ","   If there is a match, the "-i" switch looks to see if a "@tunnel" attribute exists.  Since the "-i" option was used, this will require a break "-b" later on (x3), and in this case it happens immediately.  If the "@tunnel" attribute is found, the command outputs a literal 's', breaks out of the third level of nesting (which returns xmlstarlet to the "nmaprun/host/ports/port/service" element), and finally prints a literal comma.

------> -v @product -o ' ' -v @version -v @extrainfo -b -n   For the "service" element print the "@product"  attribute , a literal space, the "@version" and "@extrainfo" attributes, then break out of the second level of nesting returning to the following element  "nmaprun/host/ports/port", and finally print a "newline".

Right now, with nothing else, the output appears as follows:

 (localhost),80,tcp,http,Apache httpd 2.2.14(Ubuntu)
 (localhost),3306,tcp,mysql,MySQL 5.1.66-0ubuntu0.10.04.2

## chunk 3
 sed 's_^\([^\t ]*\)\( ([^)]*)\)\?\t\([^\t ]*\)_\1.\3\2_' | sort -n -t.

sed 's_^\([^\t ]*\)\( ([^)]*)\)\?\t\([^\t ]*\)_\1.\3\2_' 

^\([^\t ]*\) <--- zero or more leading characters that are neither a tab nor a space.  This is back reference \1

\( ([^)]*)\) <--- a space, a literal "(", zero or more characters that are not ")", and a closing ")".  This is back reference \2

\?\t <--- the "\?" makes the preceding group (back reference \2) optional (a GNU sed extension), followed by a literal tab

\([^\t ]*\) <--- zero or more characters that are neither a tab nor a space.  This is back reference \3

\1.\3\2 <--- new order for the back-referenced groups, with a literal "." inserted between \1 and \3

The addition of the third chunk of the command results in identical output.  At this point I don't think I have had enough experience with the larger command to understand the purpose of chunk #3.  If anyone else knows, please post here.
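One quick way to probe chunk #3 is to replay the pattern in Python (re syntax differs slightly from sed BRE: groups use bare parentheses and "?" is unescaped). Note that the pattern requires a literal tab between the address/hostname portion and the rest of the line, while the xmlstarlet command above emits commas; against comma-separated input the substitution never fires, which would explain the identical output. Against tab-separated input (perhaps what an earlier version of the one-liner produced) it folds the port into the address so that sort -n -t. can sort on the dotted fields. The sample lines below are illustrative:

```python
import re

# Python translation of the sed expression:
#   s_^\([^\t ]*\)\( ([^)]*)\)\?\t\([^\t ]*\)_\1.\3\2_
pattern = re.compile(r'^([^\t ]*)( \([^)]*\))?\t([^\t ]*)')

comma_line = "10.0.0.5 (localhost),80,tcp,http"   # comma-separated, like the output above
tab_line = "10.0.0.5 (localhost)\t80,tcp,http"    # tab-separated variant

print(pattern.sub(r'\1.\3\2', comma_line))  # unchanged: no tab, so no match
print(pattern.sub(r'\1.\3\2', tab_line))    # 10.0.0.5.80,tcp,http (localhost)
```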


I hope there is enough here to attack the next piece of xml that comes our way. 

Have a nice day.