Monitoring DHCP using ELK

I am a bit unsure what to call this blog post, as it covers both monitoring and analyzing the DHCP network service. I will stick with monitoring DHCP using ELK for now. One of the important things the CIS Controls talk about is knowing the clients and devices connected to your network. This holds true for your guest networks as well as your production networks. The three CIS Controls relevant to this blog post are

  1. Utilize an Active Discovery Tool
    Utilize an active discovery tool to identify devices connected to the organization’s network and update the hardware asset inventory.
  2. Use a Passive Asset Discovery Tool
    Utilize a passive discovery tool to identify devices connected to the organization’s network and automatically update the organization’s hardware asset inventory.
  3. Use DHCP Logging to Update Asset Inventory
    Use Dynamic Host Configuration Protocol (DHCP) logging on all DHCP servers or IP address management tools to update the organization’s hardware asset inventory.

This basically translates to: audit and monitor the devices connected to your network. Once you know this, you also know the trends, i.e. when devices normally connect to your network, when lease assignment peaks during the day, and how many renews occur during the day or week. What about monthly or even quarterly trend reports? Is that even possible? It turns out that with modern tools this is quite possible. You do not need to trust these claims blindly, lo and behold:

DHCP events Dashboard using ELK

DHCP monitoring and analysis with Elasticsearch and Kibana Dashboard

This Elasticsearch dashboard shows the number of requests made to the DHCP servers during a week in June 2020. It further shows which clients made these requests, along with whether the request was granted a lease, dropped, answered with a NACK, or simply presented a renew offer. I will not be focusing on troubleshooting or DHCP threat hunting techniques in this post, but will show you how to set up a solution that fulfills the CIS Control requirements. By troubleshooting and threat hunting I mean actions like trying to find a specific client and digging deep into the details related to that client. That I shall keep for another post.

Rightly so, you should also have a network scanning tool to identify clients that are not obtaining IP addresses from the DHCP servers, but it is the usual needle-in-a-haystack problem that we are solving here. The Kibana dashboard shown here consists of the following visualizations or components.

DHCP events vertical bars using ELK:

DHCP monitoring with elasticsearch Dashboard Verticalbars

This image shows all clients that contacted the DHCP servers to obtain a lease. The key lies under the buckets section, which shows the eventType. The series is further split by machine/client name.

DHCP events Heat map using ELK:

DHCP monitoring with elasticsearch Dashboard Heatmap

The image above shows the same information as the vertical bar view. However, the intention of including it here is to show how you can create the heat map; this is illustrated under the buckets section.

DHCP events Pie charts using ELK:

DHCP monitoring with elasticsearch Dashboard pieChart

In my opinion the pie charts are the best representation of DHCP information when you are troubleshooting: when you choose an object from the list on the left, the related events get highlighted, which makes this visually very impressive and intuitive to work with.

DHCP events Timeline using ELK:

DHCP monitoring with elasticsearch Dashboard TimeLion

The Timelion representation is meant to show trend data or to perform comparative analysis. You could, for example, easily have a timeline showing all events from today compared with yesterday, or with the same day last week. There are a lot of fun things that can and should be done here. In the left window, the expression shows the query that has been used to pull the data. It might differ slightly for your environment, but nevertheless the example provided here should (hopefully) be good guidance.
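As a sketch, assuming an index pattern of dhcp-* and @timestamp as the time field (both of which will differ per environment), a Timelion expression comparing today's events with the same series shifted back one day could look like this:

```
.es(index=dhcp-*, timefield=@timestamp).label('today'),
.es(index=dhcp-*, timefield=@timestamp, offset=-1d).label('yesterday')
```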

Solution description

The solution consists of Elasticsearch, Logstash and Kibana (the ELK stack), along with monitoring or audit logs from DHCP and some Powershell magic. I will be sharing all these components. I am a bit unsure whether I will be able to share the .exe file here on the site, as it blocks most executables. However, as you will get access to the Powershell code, you can easily create the .exe file yourself, or just create a scheduled task to perform the same action, i.e. run the script every five minutes or so. You can also choose to run it far less frequently.

Auditing DHCP servers

In order to make this solution work you need to enable auditing on your DHCP servers. For most of you who are using Microsoft DHCP services, chances are very high that auditing is already enabled. But you need to make sure that the file size is sufficient for your environment; more on this a bit further down.

Enabling and checking DHCP auditing

By default, Windows Server writes audit logs to the DHCP folder found under the System32 folder. This is illustrated in the image below.

DHCP monitoring Windows logs

The DHCP audit log file size limit is by default not that large; if memory serves me right, it is 60 or 80 MB. If you have a decent number of devices on your network, chances are that these log file sizes are too low, and you should consider increasing them. Furthermore, if you are using scope failover between two DHCP servers, you will also have a large number of packet drop events, as the lease would be granted by the partner DHCP server. To accommodate this, a large file size is recommended. When these log files fill up, the DHCP server stops logging more data and reuses or overwrites the same file the following week.

“Size of DHCP log files is pivotal! Set it according to your environment and needs!”

In order to check and set the log file size you can use a Powershell one-liner:

Get-DhcpServerAuditLog
Set-DhcpServerAuditLog -Enable $true -MaxMBFileSize 500
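If memory serves, the new maximum size only takes effect once the DHCP Server service has been restarted, so a sketch of the full sequence would be:

```powershell
# Enable auditing and raise the maximum audit log size to 500 MB
Set-DhcpServerAuditLog -Enable $true -MaxMBFileSize 500

# Restart the DHCP Server service so the new size takes effect
Restart-Service -Name DHCPServer

# Verify: MaxMBFileSize should now read 500
Get-DhcpServerAuditLog
```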

Now that auditing on the DHCP servers has been sorted out, we need to divert our focus to how to extract information from the log files and make it usable within the ELK stack. This can easily be done by using the Powershell script provided below.

#Function definitions
Function GetDhcpLogFiles ()
{
#Check day and return logfile only for that day!
$dateday = (Get-Date).DayOfWeek
$dateday = $dateday.ToString()
$dateday = "-"+ $dateday.Substring(0,3)
#Write-Host $dateday -ForegroundColor Red
#Provides logfiles location
$files = Get-ChildItem C:\Windows\System32\dhcp\
    foreach ($file in $files)
    {
    #Write-Host $file.Name -ForegroundColor Yellow
    if(($file.Name -match "$dateday") -and ($file.Name -match "DhcpSrvLog"))
        {
        $currentFile = $file.Name
        $rawContent = Get-Content C:\Windows\System32\dhcp\$currentFile | Select-Object -Skip 33
        #Write-Host $currentFile -ForegroundColor Red
        $testfldrdhcp = Test-Path -Path C:\Temp\Dhcp
            #Write-Host $testfldrdhcp
            if($testfldrdhcp -ne $true)
                {
                #Out-Null keeps the New-Item output from polluting the function's return value
                New-Item -ItemType directory -Path C:\Temp\Dhcp\ -Force | Out-Null
                }
           Remove-Item C:\Temp\Dhcp\* -Recurse -Force
           Add-Content C:\Temp\Dhcp\$currentFile -Value $rawContent -Force
           $logContent = Import-Csv -Path C:\Temp\Dhcp\$currentFile -Delimiter ","
        }
    }

Return $logContent
}


function CreateFolders()
{
$test = Test-Path C:\Windows\DHCP-Archive
if($test -ne $true)
    {
    #Out-Null keeps the New-Item output from polluting the function's return value
    New-Item -ItemType directory -Path C:\Windows\DHCP-Archive -Force | Out-Null
    }
$mydate = Get-Date -Format "MMM-yyyy"
$mydate = $mydate.ToString()

#Check if a folder for this month and year exists; if not, create one
$test = Test-Path C:\Windows\DHCP-Archive\$mydate

if($test -ne $true)
    {
    New-Item -ItemType directory -Path C:\Windows\DHCP-Archive\$mydate -Force | Out-Null
    }
$logfileDestination = "C:\Windows\DHCP-Archive\$mydate"

Return $logfileDestination
}




##############################################
#################Main routine#################
##############################################
$loop = $true

while($loop -eq $true)
{
Start-Sleep -Seconds 1800

$destinationFolder = CreateFolders #Call function to check and create folder
$logContent = GetDhcpLogFiles #Call function to fetch logfiles and create archive files

$limit = $logContent.Count

$mydate = Get-Date -Format "ddMMyy"
$day = (Get-Date).DayOfWeek 
$mydate = $mydate.ToString()
$day = $day.ToString() 
$day = $day.Substring(0,3)
$filename = $mydate+"-"+$day


#Write-Host $filename
#Write-Host "Time," "ID," "Description," "IPAddress," "Name," "MACaddress" -ForegroundColor Green
$leaseInfo = @() #Initialize as an empty array, not a scriptblock


for($i=0; $i -lt $limit;$i++)
{
#Matching criteria for search
    $description = $logContent[$i].description
    $id =  $logContent[$i].id
    $time =  $logContent[$i].time
    $ipaddress =  $logContent[$i].'IP Address'
    $name =  $logContent[$i].'Host Name'
    $macaddress = $logContent[$i].'MAC Address'
    if(($id -eq "10") -or ($id -eq "11") -or ($id -eq "20") -or ($id -eq "21") -or ($id -eq "36")){
            #Write-Host $time","$id","$description","$ipaddress","$name","$macaddress 
#Format the text as you want it!
            $leaseInfo += $time+","+$id+","+$description+","+$ipaddress+","+$name+","+$macaddress
            
        }
}

$destinationFolder = $destinationFolder + "\" + $filename +".log"

#Create a file with days date and place in Powershell script folder

New-Item -ItemType file -Path $destinationFolder -Force


foreach($entry in $leaseInfo){
Add-Content -Path $destinationFolder -Value $entry
}

}

Now, there are a couple of things worth mentioning here. In the script above you can see the comment “#Format the text as you want it!”. There are at least a couple of ways this data can be inserted into ELK, or Splunk if you use that. You can either forward events as syslog events, or you can copy the entire logs to a file share where they can be picked up by Logstash for ingestion into Elasticsearch. For our use we have chosen the file share method, as it is not important to get live data all the time, but rather just to have the data available for analysis. Furthermore, Operations Manager does quite a decent job monitoring the health of the DHCP service.
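For example, the line built inside the script's filter loop could be swapped for a key=value join, so Logstash can parse it without a Grok pattern (the field names here are my own choice, not a requirement):

```powershell
#Key=value variant of the "#Format the text as you want it!" line
$leaseInfo += "time=$time id=$id description=`"$description`" ipaddress=$ipaddress name=$name macaddress=$macaddress"
```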

The Powershell script provided earlier creates this folder structure on the C: drive.

DHCP monitoring file structure

Each folder contains daily DHCP event audit logs.

DHCP monitoring file structure files

In order to ingest it properly, I have chosen to provide key-value pairs. This can easily be adjusted in the Powershell script. The reason for choosing key-value pairs instead of a Grok expression is that the KV filter is much more efficient and less resource intensive in Logstash. Anyway, first things first: the contents of the audit log files look like this when they have been copied to a file share which is accessible to ELK.
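With the comma-separated join used in the script above, the archived lines follow this pattern (the host names and addresses below are made up for illustration):

```
10:21:33,10,Assign,192.168.10.57,client01.example.local,AABBCCDDEEFF
10:36:02,11,Renew,192.168.10.61,client02.example.local,A1B2C3D4E5F6
```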

DHCP monitoring file structure files

Recurring script execution

The Powershell script must be run at least once during the day, ideally just before midnight, but as you can see I have chosen to run it all the time with a sleep timer of 30 minutes. This interval can be decreased to whatever you want, but for us, missing the few entries that occur between 23:30 and 00:00 is not that crucial. And that is the worst case, as the script might run at 23:55 and manage to capture most of the events for the day. Anyway, in order to create a recurring schedule you can create a scheduled task or use software to turn the script into a Windows service. I am describing the Windows scheduled task method that we use.

DHCP monitoring and auditing task with ELK

As you can see, a very basic task is generated to run Powershell.exe, which in turn runs our Powershell script.

DHCP monitoring Task details

The command utilized is:

DHCP monitoring scheduled task command
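In plain text, the command amounts to running Powershell.exe against the script; the path below is a placeholder for wherever you keep the script in your environment:

```
Powershell.exe -ExecutionPolicy Bypass -File "C:\Scripts\DhcpAuditLogs.ps1"
```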

As you can see, it is also possible to create an .exe file based on the Powershell script. We are using this approach.

DHCP monitoring and analysis .exe file

The last piece of the puzzle is the Logstash filter that you need in order to ingest the data into Elasticsearch and complete the DHCP event auditing solution.

DHCP monitoring with elasticsearch logstash filter ELK
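As a rough sketch of what such a pipeline can look like, assuming the key=value formatting mentioned earlier, and with the file path and index name as placeholders for your own environment, a minimal Logstash configuration could be:

```
input {
  file {
    # Share where the archived DHCP logs are copied (adjust to your environment)
    path => "/mnt/dhcp-share/**/*.log"
    start_position => "beginning"
  }
}

filter {
  # Split each key=value pair into its own field
  kv {
    field_split => " "
    value_split => "="
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dhcp-%{+YYYY.MM.dd}"
  }
}
```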

Final thoughts and recommendations

As you can see, the solution is quite easy to implement, and thanks to Powershell it is quite customizable. The same solution can also be used with several other SIEM solutions like Splunk; however, I personally prefer the ELK stack as it provides a flexibility that I have almost never had with other products. If you have any questions or get stuck in the implementation, feel free to drop a comment and I will try to respond in a timely manner. The next post will focus on using ELK to monitor Palo Alto Networks firewalls, or ELK to monitor DNS, so remember to visit back. Thanks for reading if you have made it this far. The Logstash filter can be downloaded from the link below.
