Trying to show a colleague of mine that visualisation is much easier with Kibana than with Splunk, I'm trying to build a use case around Active Directory security logging. But it looks like somewhere in the process I'm doing something wrong, or it is simply not working as expected.
I'm trying to import an evtx file into an ELK stack running on a Linux server. The exported file has been copied to a Windows 10 desktop in order to have it ingested via Winlogbeat.
Import the dashboards:
win10> .\winlogbeat.exe setup --dashboards
Import the evtx file via the Logstash beats input:
win10> .\winlogbeat.exe -e -c .\winlogbeat-evtx.yml -E EVTX_FILE=Security.evtx
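(For reference, a winlogbeat-evtx.yml used with the -E EVTX_FILE override is typically something like the sketch below; the Logstash host is a placeholder and the exact options in my file may differ.)

winlogbeat.event_logs:
  - name: ${EVTX_FILE}        # path to the exported .evtx, passed on the command line via -E EVTX_FILE=...
    no_more_events: stop      # stop when the end of the file is reached instead of waiting for new events

winlogbeat.shutdown_timeout: 30s              # give Winlogbeat time to flush before exiting
winlogbeat.registry_file: evtx-registry.yml   # dedicated registry file for the one-off import

output.logstash:
  hosts: ["<linux-server>:5044"]   # placeholder: the Logstash beats input on the ELK host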
There is data to explore, but it looks like there are fields missing, even though I'm sure the event IDs are present and should be processed by winlogbeat-security.js.
Hi,
Could you check your mapping and share it here?
Usually when you can't use aggregations on a field, it's because the field type isn't keyword.
So check your index mapping and make sure the index is using the correct template.
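For example (assuming Kibana Dev Tools is available; the index pattern is a guess), something like this shows the mapping of a single field:

GET winlogbeat-*/_mapping/field/event.action

or from the Linux host itself:

curl -s 'http://127.0.0.1:9200/winlogbeat-*/_mapping/field/event.action?pretty'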
Hi,
Attached is the mapping from the index winlogbeat.
This is a fresh, empty single-node ELK system on Linux, so how could it end up using a wrong template?
Thanks for your time!
Andre
PS: I'm not able to attach files, and the post is limited by the number of characters.
Take a look at lines 88-95. Your event.action main type is text, with a keyword subfield.
As you know, you can't use aggregations on text fields.
Could you share your index template too? I'm guessing your template mappings are correct and, for some reason, your index isn't using your template.
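If it helps, the template itself can be pulled straight from Elasticsearch instead of attached as a file (wildcard used because I don't know your exact Winlogbeat version):

GET _template/winlogbeat-*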
Sorry, I don't know. It is a fresh installation following all the instructions found on the Elastic website, so is this causing problems because I missed a step or did something wrong? I did not create a dashboard, mapping or template myself.
OK, when having a look via Index Management I have an index winlogbeat-2020-11-18. Looking at Index Templates, I see a legacy index template winlogbeat-7.10.0 which will be applied to the index pattern winlogbeat-7.10.0-*. So I guess you are absolutely right about the index not using the template. The question is where to correct this.
Should I correct this in the Logstash config?

output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY-MM-dd}"
  }
}
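(A quick way to double-check which index pattern the legacy template actually expects, assuming Dev Tools and the template name seen in Index Management:)

GET _template/winlogbeat-7.10.0?filter_path=*.index_patterns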
Changed logstash.conf:

input {
  beats {
    port => "5044"
  }
}

output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    index => "%{[@metadata][beat]}-7.10.0-%{+YYYY-MM-dd}"
    # index => "%{[@metadata][beat]}-%{+YYYY-MM-dd}"
  }
}
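Side note: since the events go through Logstash instead of straight to Elasticsearch, Winlogbeat won't load its index template automatically; as far as I understand it can be loaded manually from the Windows box with something along these lines (the Elasticsearch host is a placeholder):

win10> .\winlogbeat.exe setup --index-management -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["http://<es-host>:9200"]'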
Removed all previous data and started all over again.
The dashboards are now definitely looking better, thanks, but something is still missing:
Could not locate that index-pattern-field (id: winlog.logon.id)
Trying to find out what is still missing and why; I will let you know, or maybe you already have a hint where to look.
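(Possibly the winlogbeat-* index pattern in Kibana just hasn't picked up that field yet; refreshing its field list under Stack Management > Index Patterns, or re-running setup directly against Elasticsearch and Kibana, might be worth a try. The hosts below are placeholders:)

win10> .\winlogbeat.exe setup -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["http://<es-host>:9200"]' -E setup.kibana.host=<kibana-host>:5601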