While an ELK cluster is typically used for live monitoring, Winlogbeat can be tweaked to manually send "cold" logs – old, inactive Windows Event Log (EVTX) files – to ELK. This functionality allows an analyst to take EVTX files from collected system images and leverage the ELK stack for their investigations, bypassing the typical live-monitoring use case for ELK.
This opens the door to using ELK as a post-incident investigative tool for DFIR analysts, rather than just for live analysis. This post goes over a possible way to engage in this process.
WHAT YOU NEED
- A functioning ELK stack (Logstash NOT required)
- Single-Node or Multi-Node are both acceptable
- Ubuntu or CentOS as your ELK Node OS
- Windows Host Machine to send EVTX log files from
- Winlogbeat Data Shipper
The following guide will be based on a Single-Node ELK stack, running Ubuntu 19, that does not utilize Logstash.
DISCLAIMER: These files were created and modified for the purpose of this post. As a result, a "plug-and-play" method of simply copying these files and executing the process may not work as intended, whether due to my goals in this project or future Winlogbeat updates. My hope is that the information here helps guide you through this process and makes it easier to implement within your own environment for your own purposes.
HOW THIS WORKS
Winlogbeat is the data shipper created by Elastic used to send “hot,” or live EVTX files to an ELK stack as they happen. This allows for live monitoring of systems based on recorded events that happen in real-time. For digital forensic investigators, this data is often never hot, but rather cold – collected from images obtained of the systems locked at a specific point in time.
Winlogbeat operates off the default configuration file "winlogbeat.yml" for all settings. It also typically runs as a service on the host machine, operating in the background as it grabs EVTX data upon creation. The setup for this guide involves swapping in a modified version of this file, as well as using a quick Windows PowerShell script to invoke Winlogbeat manually, rather than as a service, with the new YML file.
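In other words, rather than running as a service, the shipper gets launched once per saved log with an alternate config file. A rough sketch of what that one-shot invocation looks like (the file names and paths here are illustrative):

```powershell
# Illustrative only: run Winlogbeat once against a single cold EVTX file.
# -e logs to stderr instead of syslog/file output
# -c points at an alternate configuration file
# -E overrides or injects individual settings at runtime
.\winlogbeat.exe -e -c .\winlogbeat-evtx.yml -E EVTX_FILE="C:\Evidence\Security.evtx"
```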
DOWNLOAD WINLOGBEAT [OPTIONAL]
If you have not done so already, go to the following link to download the latest version of Winlogbeat.
NOTE: For this guide, I used version 7.3.2.
Download Link: https://www.elastic.co/downloads/beats/winlogbeat
Once downloaded, extract the contents and place the resulting Winlogbeat folder wherever you see fit on your system.
SETUP WINLOGBEAT FOR ELK [OPTIONAL]
Within the extracted Winlogbeat folder, you should see a file labeled "winlogbeat.yml" – this is the default configuration for the service. If you have not used Winlogbeat within your ELK cluster before, go ahead and perform the following steps. If you have, feel free to leave "winlogbeat.yml" alone and skip them.
Open "winlogbeat.yml" within your favorite text editor. Scroll down to the "Outputs" section and modify the "hosts" option to point to the IP of your Elasticsearch instance. For Single-Node clusters, Elasticsearch resides on the same node as the rest of your ELK processes.
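With that change, the output section of "winlogbeat.yml" should look something like the fragment below (the IP address is a placeholder; substitute your own Elasticsearch node):

```yaml
# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to; 9200 is the default Elasticsearch port.
  hosts: ["192.168.1.50:9200"]
```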

Once completed, save the file and then open a PowerShell window as Administrator on your host machine. We want to upload the necessary Winlogbeat template to ELK for proper parsing. Within this window, navigate to your Winlogbeat folder and run the following command:
.\winlogbeat.exe setup -e
You should now be all set to modify Winlogbeat.
CUSTOM WINLOGBEAT YML FILE
Ignoring the default “winlogbeat.yml” file, create a new file within that same directory (should be the root of the Winlogbeat folder) and name it something that makes sense to you. For example, I will name it “winlogbeat-evtx.yml” for the purpose of this post.
Once created, open it with your favorite text editor and add the following:
# EVTX ELK Winlogbeat Configuration File | v1.0
# Zachary Burnham 2019 | @zmbf0r3ns1cs

# Winlogbeat Shipper Settings
winlogbeat.event_logs:
  - name: ${EVTX_FILE}
    no_more_events: stop
winlogbeat.shutdown_timeout: 60s
winlogbeat.registry_file: evtx-registry.yml

# Allow ELK to see active connections
monitoring.enabled: true

# Add/Drop fields for searching within Kibana
processors:
  - add_fields:
      target: ''
      fields:
        client: ${CLIENT}
        case_number: ${CASE}
        identifier: ${ID}
        log_file: ${FILE}
  - drop_fields:
      fields: ["event.kind", "event.code", "agent.ephemeral_id", "ecs.version"]

# Client Index Creation
setup.ilm.enabled: false
output.elasticsearch.index: '${CASE}-${ELK_CLIENT}-evtx'
setup.template.name: "winlogbeat-7.3.2"
setup.template.pattern: "${CASE}-${ELK_CLIENT}*"

# Output data to Elasticsearch
output.elasticsearch:
  hosts: ['http://<ELK-IP>:9200']
There is a lot going on here, so here is a quick breakdown:
- You may notice PowerShell variables in this file, such as ${EVTX_FILE} – these are required and are filled in by the PowerShell script I will be covering next in this post.
- The Winlogbeat registry file (evtx-registry.yml) is how Winlogbeat keeps track of which files have already been uploaded, by path, to prevent duplicate uploads. It also keeps a record of which records within each EVTX file have been uploaded, so if an upload is interrupted it can easily resume later. It is given a custom name in this configuration file to separate it from the official one used by Winlogbeat during typical use as a service, as a form of best practice.
- You may notice I have added custom fields that are populated by PowerShell variables – this is so investigators can label uploads by client, case number, etc. If this does not apply to you, feel free to comment out (#) everything below "add_fields," leaving "drop_fields" un-commented.
- The index creation portion creates an additional index, using the Winlogbeat template we uploaded to ELK earlier, named per client and case (${CASE}-${ELK_CLIENT}-evtx). For my version of the YML file, a name (client or otherwise) and some form of identifier (case number, etc.) are required to create the index, splitting the data up for easy viewing in Kibana. Again, feel free to modify this to require whatever you need if it does not apply to you. This is merely a template for the process to steer you in the right direction.
All settings in this custom YML file are legitimate settings that can be used within "winlogbeat.yml," just stripped of all the other "fluff" in the original YML that we don't need in this case.
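To make the variable flow concrete, here is a small Python sketch (not part of the toolchain, purely an illustration) of how the values handed to Winlogbeat end up forming the index name defined by output.elasticsearch.index:

```python
# Illustrative only: mimics how the ${VAR} placeholders in the custom YML
# resolve once the PowerShell script passes values in via -E flags.

def resolve_index(case: str, client_name: str) -> str:
    """Build the ELK index name the same way the script and YML do."""
    # The script replaces spaces with underscores and lowercases the
    # client name before handing it to Winlogbeat as ELK_CLIENT
    # (Elasticsearch index names must be lowercase).
    elk_client = client_name.replace(" ", "_").lower()
    # output.elasticsearch.index: '${CASE}-${ELK_CLIENT}-evtx'
    return f"{case}-{elk_client}-evtx"

print(resolve_index("20-0101", "Burnham Forensics"))
# 20-0101-burnham_forensics-evtx
```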
EVTX UPLOAD POWERSHELL SCRIPT
Now that we have the instructions created to feed Winlogbeat, we need a vehicle to make that happen. The easiest way I saw of doing this was through a Windows PowerShell script (.ps1). Open Windows PowerShell ISE on your host machine as Administrator and copy the following code into the editor portion of the program:
# Welcome Banner
Write-Host "ELK EVTX UPLOAD SCRIPT | v1.0"
Write-Host "Zachary Burnham 2019 | @zmbf0r3ns1cs"
Write-Host "[!] If multiple systems, please ensure all logs are local and grouped by system within nested folders." "`n"
# Backend Maintenance
# Check to see if Winlogbeat Registry File from prior uploads exists
$regFile = Test-Path $pwd\winlogbeat\data\evtx-registry.yml
# If it does exist, remove Winlogbeat Registry File
if($regFile -eq $true){Remove-Item -Path $pwd\winlogbeat\data\evtx-registry.yml -Force}
# Get current date for logging
$date = Get-Date -UFormat "%m-%d"
# Ask for path containing desired logs
Do {
    Write-Host "Enter target directory path containing EVTX logs or folders grouping them by system (i.e. C:\Users\zburnham\EVTX-Logs)."
    $tempPath = Read-Host "Path"
    # Check to see if input path exists
    $checkPath = Test-Path $tempPath
    if($checkPath -eq $false){Write-Host "[!] Directory Path not found. Please check your input and try again." "`n"}
}
Until ($checkPath -eq $true)
Write-Host ""
# Adjust target directory path to have proper syntax for Winlogbeat, if needed
$userPath = $tempPath -replace '/','\'
# Check for nested folders
Write-Host "Do you have nested folders labeled by system within this directory? (Default is NO)"
$nested = Read-Host "(y/n)"
Switch ($nested){
    Y {Write-Host ""}
    N {Write-Host ""}
    Default {
        Write-Host ""
        Write-Host "[*] Defaulting to no nested folders..." "`n"
    }
}
# Perform directory check if nested folders exist
if($nested -eq "y"){
    Do {
        # Filter for all folders
        $folders = Get-ChildItem -Path $userPath -Directory
        # Verify Info
        Write-Host "The following folders (systems) were detected:"
        Write-Host ""
        Write-Host $folders
        Write-Host ""
        Write-Host "Is this the data you wish to upload? (Default is NO)"
        $answer = Read-Host "(y/n)"
        Switch ($answer){
            Y {Write-Host ""}
            N {Write-Host ""}
            Default {
                Write-Host ""
                Write-Host "[*] Defaulting to NO..." "`n"
            }
        }
    }
    Until ($answer -eq "y")
}
# Ask for Client Name
$tempClient = Read-Host "Enter Client Name (i.e. Burnham_Forensics)"
# Replace Spaces, if any, in name for ELK Index Name
$client = $tempClient -replace '\s','_'
# Convert to Lowercase for ELK Index Name
$elkClient = $client.ToLower()
# Ask for Case Number
$case = Read-Host "Enter Case # (i.e. 20-0101)"
if($nested -eq "n"){
    # Ask for Identifier for easier searching in Kibana
    Write-Host ""
    Write-Host "Enter a searchable identifier or note for this evidence upload (i.e. BURNHAM-W10)"
    $ID = Read-Host "Identifier"
}
# Informative Message regarding Index Creation
Write-Host ""
Write-Host "ELK Index: $case-$elkClient-evtx"
Write-Host "[!] If new client, don't forget to add this index for viewing under 'Index Patterns' within Kibana settings." "`n"
Write-Host "[*] Logs for this upload can be found in 'elk-logging' within the root 'ELK-Tools' folder." "`n"
# Nested Folders Code
if($nested -eq "y"){
    # Filter for all folders
    $folders = Get-ChildItem -Path $userPath -Directory
    # Create for loop to cycle through all folders
    foreach($folder in $folders){
        # Define loop vars
        $i = 1
        $ID = $folder
        $foldersPath = $userPath + "\" + $folder
        # Filter for just the .evtx files within selected folder
        $dirs = Get-ChildItem -Path $foldersPath -filter *.evtx
        $dirsCount = $dirs.Count
        # Create for loop to grab all .evtx files within selected folder
        foreach($file in $dirs){
            # Add shiny progress bar
            $percentComplete = ($i / $dirsCount) * 100
            Write-Progress -Activity "$i of $dirsCount EVTX files found within $foldersPath sent to ELK" -Status "Uploading $file..." -PercentComplete $percentComplete
            $filePath = $foldersPath + "\" + $file
            # Execute Winlogbeat w/custom vars
            .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml -E EVTX_FILE="$filePath" -E CLIENT="$client" -E ELK_CLIENT="$elkClient" -E CASE="$case" -E ID="$ID" -E FILE="$file" 2>&1 >> $pwd\elk-logging\winlogbeat_log_${date}.txt
            Sleep 3
            $i++
        }
    }
}
# Single Folder Code
if($nested -eq "n"){
    $i = 1
    # Filter by EVTX extension
    $dirs = Get-ChildItem -Path $userPath -filter *.evtx
    $dirsCount = $dirs.Count
    # Create for loop to grab all .evtx files within selected folder
    foreach($file in $dirs){
        # Add shiny progress bar
        $percentComplete = ($i / $dirsCount) * 100
        Write-Progress -Activity "$i of $dirsCount EVTX files found within $userPath sent to ELK" -Status "Uploading $file..." -PercentComplete $percentComplete
        $filePath = $userPath + "\" + $file
        # Execute Winlogbeat w/custom vars
        .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml -E EVTX_FILE="$filePath" -E CLIENT="$client" -E ELK_CLIENT="$elkClient" -E CASE="$case" -E ID="$ID" -E FILE="$file" 2>&1 >> $pwd\elk-logging\winlogbeat_log_${date}.txt
        Sleep 3
        $i++
    }
}
# Show message confirming successful upload
Write-Host "[*] EVTX Upload completed. Use the 'Discover' tab in Kibana to view."
The script may look daunting; however, the entire file is commented to provide context as to what is going on. Here are a few points:
- Based on how I wrote it, the script must be run from the Winlogbeat folder's PARENT DIRECTORY. In this case, I put the Winlogbeat folder in a folder called "ELK-Tools" to keep it organized and prevent a random Winlogbeat folder from floating around. This also helps me identify that this version of Winlogbeat has been modified for manual upload.
- The script automatically finds the "evtx-registry.yml" file and deletes it, if it exists from a prior upload. This is done so that logs previously uploaded from the same directory are not blocked from being uploaded again.
- The script will perform different tasks depending on the folder structure of the provided directory. You can point the script to a folder containing EVTX logs OR a folder containing sub-folders labeled by SYSTEM HOSTNAME to upload multiple systems at once, in a loop.
- The script asks for a case number, identifier, etc. for investigators to use. In my version of this script, only the client name and case number are required; feel free to comment out the lines responsible for variables you don't need. If it makes more sense, rename $CLIENT to something else, but make sure it stays consistent with the "winlogbeat-evtx.yml" file you created earlier.
- Toward the end of the script, you can see winlogbeat.exe being invoked using the "winlogbeat-evtx.yml" file we created earlier; this is possible because Winlogbeat natively accepts command-line arguments. Each argument following a "-E" is injected into "winlogbeat-evtx.yml", hence the PowerShell variables.
- NOTE: You will need to modify the lines executing winlogbeat.exe (.\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml…) to match your own folder structure.
- The script also takes Winlogbeat's console output and writes it to a text file labeled by date within a folder called "elk-logging" in the Winlogbeat folder's parent directory. This allows for easy troubleshooting if something goes wrong, and is another reason I created the ELK-Tools folder in my example.
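The nested-folder logic boils down to: one pass per system folder, one Winlogbeat run per EVTX file, with the folder name doubling as the identifier. A rough Python equivalent of that traversal (the paths are hypothetical, and a list of planned uploads stands in for the actual winlogbeat.exe calls):

```python
# Sketch of the script's nested-folder traversal: each sub-folder is
# treated as one system, and each .evtx file inside it would get its
# own Winlogbeat invocation.
from pathlib import Path

def plan_uploads(root: str) -> list[tuple[str, str]]:
    """Return (identifier, file path) pairs, one per EVTX file found."""
    pairs = []
    for folder in sorted(Path(root).iterdir()):
        if not folder.is_dir():
            continue
        # The folder name becomes the searchable 'identifier' field ($ID).
        for evtx in sorted(folder.glob("*.evtx")):
            pairs.append((folder.name, str(evtx)))
    return pairs

# Example (assuming a directory laid out as HOST-A\Security.evtx, etc.):
# for ident, path in plan_uploads(r"C:\Evidence"):
#     print(ident, path)
```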

You should now be able to execute this script, follow the on-screen prompts, and upload to your ELK cluster. Once uploaded, you will need to go through the typical process of adding the created index to your Kibana Discover page within Settings under "Index Patterns."
You may also notice a reference to a "filebeat" folder in my setup – I hope to have a post about manually uploading IIS logs to ELK using Filebeat in the future.
As always, if you have any questions or comments please feel free to add them in the reply section below.
A great guide, there’s surprisingly little support out there for “cold” EVTX ingestion! Rather than use Winlogbeats, I opted for a slightly different approach (https://github.com/S-RM/HELi) – but either way, getting these Event Logs into Elasticsearch is clearly the way to go!
James – I appreciate this, thank you for sharing! I’ll be sure to look at HELi soon, seems pretty neat 😎
Thanks for all your work on this Zach! I recently started using this to bulk ingest event logs. We had to make a couple of changes to get it to iterate through multiple folders, but we got it going. I posted our updated script:
PowerShell:
# Welcome Banner
Write-Host "ELK EVTX UPLOAD SCRIPT | v1.0"
Write-Host "Zachary Burnham 2019 | @zmbf0r3ns1cs"
Write-Host "[!] If multiple systems, please ensure all logs are local and grouped by system within nested folders." "`n"
# Backend Maintenance
# Check to see if Winlogbeat Registry File from prior uploads exists
$regFile = Test-Path $pwd\winlogbeat\data\evtx-registry.yml
# If it does exist, remove Winlogbeat Registry File
if($regFile -eq $true){Remove-Item -Path $pwd\winlogbeat\data\evtx-registry.yml -Force}
# Get current date for logging
$date = Get-Date -UFormat "%m-%d"
# Ask for path containing desired logs
Do {
    Write-Host "Enter target directory path containing EVTX logs or folders grouping them by system (i.e. C:\Users\zburnham\EVTX-Logs)."
    $tempPath = Read-Host "Path"
    # Check to see if input path exists
    $checkPath = Test-Path $tempPath
    if($checkPath -eq $false){Write-Host "[!] Directory Path not found. Please check your input and try again." "`n"}
}
Until ($checkPath -eq $true)
Write-Host ""
# Adjust target directory path to have proper syntax for Winlogbeat, if needed
$userPath = $tempPath -replace '/','\'
# Check for nested folders
Write-Host "Do you have nested folders labeled by system within this directory? (Default is NO)"
$nested = Read-Host "(y/n)"
Switch ($nested){
    Y {Write-Host ""}
    N {Write-Host ""}
    Default {
        Write-Host ""
        Write-Host "[*] Defaulting to no nested folders..." "`n"
    }
}
# Perform directory check if nested folders exist
if($nested -eq "y"){
    Do {
        # Filter for all folders
        $folders = Get-ChildItem -Path $userPath -Directory
        # Verify Info
        Write-Host "The following folders (systems) were detected:"
        Write-Host ""
        Write-Host $folders
        Write-Host ""
        Write-Host "Is this the data you wish to upload? (Default is NO)"
        $answer = Read-Host "(y/n)"
        Switch ($answer){
            Y {Write-Host ""}
            N {Write-Host ""}
            Default {
                Write-Host ""
                Write-Host "[*] Defaulting to NO..." "`n"
            }
        }
    }
    Until ($answer -eq "y")
}
# Ask for Client Name
$tempClient = Read-Host "Enter Client Name (i.e. Burnham_Forensics)"
# Replace Spaces, if any, in name for ELK Index Name
$client = $tempClient -replace '\s','_'
# Convert to Lowercase for ELK Index Name
$elkClient = $client.ToLower()
# Ask for Case Number
$case = Read-Host "Enter Case # (i.e. 20-0101)"
if($nested -eq "n"){
    # Ask for Identifier for easier searching in Kibana
    Write-Host ""
    Write-Host "Enter a searchable identifier or note for this evidence upload (i.e. BURNHAM-W10)"
    $ID = Read-Host "Identifier"
}
# Informative Message regarding Index Creation
Write-Host ""
Write-Host "ELK Index: $case-$elkClient-evtx"
Write-Host "[!] If new client, don't forget to add this index for viewing under 'Index Patterns' within Kibana settings." "`n"
Write-Host "[*] Logs for this upload can be found in 'elk-logging' within the root 'ELK-Tools' folder." "`n"
# Nested Folders Code
if($nested -eq "y"){
    # Filter for all folders
    $folders = Get-ChildItem -Path $userPath -Directory
    # Create for loop to cycle through all folders
    foreach($folder in $folders){
        # Define loop vars
        $i = 1
        $ID = $folder
        $foldersPath = $userPath + "\" + $folder
        # Filter for just the .evtx files within selected folder
        $dirs = Get-ChildItem -Path $foldersPath -filter *.evtx
        $dirsCount = $dirs.Count
        # Create for loop to grab all .evtx files within selected folder
        foreach($file in $dirs){
            # Add shiny progress bar
            $percentComplete = ($i / $dirsCount) * 100
            Write-Progress -Activity "$i of $dirsCount EVTX files found within $foldersPath sent to ELK" -Status "Uploading $file..." -PercentComplete $percentComplete
            $filePath = $foldersPath + "\" + $file
            # Execute Winlogbeat w/custom vars
            .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx1.yml -E EVTX_FILE="$filePath" -E CLIENT="$client" -E ELK_CLIENT="$elkClient" -E CASE="$case" -E ID="$ID" -E FILE="$file" 2>&1 >> $pwd\elk-logging\winlogbeat_log_${date}.txt
            Sleep 3
            $i++
        }
    }
}
# Single Folder Code
if($nested -eq "n"){
    $i = 1
    # Filter by EVTX extension
    $dirs = Get-ChildItem -Path $userPath -filter *.evtx
    $dirsCount = $dirs.Count
    # Create for loop to grab all .evtx files within selected folder
    foreach($file in $dirs){
        # Add shiny progress bar
        $percentComplete = ($i / $dirsCount) * 100
        Write-Progress -Activity "$i of $dirsCount EVTX files found within $userPath sent to ELK" -Status "Uploading $file..." -PercentComplete $percentComplete
        $filePath = $userPath + "\" + $file
        # Execute Winlogbeat w/custom vars
        .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx1.yml -E EVTX_FILE="$filePath" -E CLIENT="$client" -E ELK_CLIENT="$elkClient" -E CASE="$case" -E ID="$ID" -E FILE="$file" 2>&1 >> $pwd\elk-logging\winlogbeat_log_${date}.txt
        Sleep 3
        $i++
    }
}
# Show message confirming successful upload
Write-Host "[*] EVTX Upload completed. Use the 'Discover' tab in Kibana to view."
winlogbeat.yml
# EVTX ELK Winlogbeat Configuration File | v1.0
# Zachary Burnham 2019 | @zmbf0r3ns1cs

# Winlogbeat Shipper Settings
winlogbeat.event_logs:
  - name: ${EVTX_FILE}
    no_more_events: stop
winlogbeat.shutdown_timeout: 60s
winlogbeat.registry_file: evtx-registry.yml

# Allow ELK to see active connections
monitoring.enabled: true

# ====================== Elasticsearch template settings =======================
setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

# =================================== Kibana ===================================
# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:
  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "172.17.1.15:5601"
  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

# Add/Drop fields for searching within Kibana
processors:
  - add_fields:
      target: ''
      fields:
        clientID: ${CLIENT}
        case_number: ${CASE}
        identifier: ${ID}
        log_file: ${FILE}
  - drop_fields:
      fields: ["event.kind", "event.code", "agent.ephemeral_id", "ecs.version"]

# Client Index Creation
setup.ilm.enabled: false
output.elasticsearch.index: '${CASE}-${ELK_CLIENT}-evtx'
setup.template.name: "winlogbeat"
setup.template.pattern: "${CASE}-${ELK_CLIENT}*"

# Output data to Elasticsearch
output.elasticsearch.hosts: ["172.17.1.15:9200"]