Hello Everyone, I hope you are all doing well.
I am sorry to come here with this question, but I could really use some expert help to solve an issue I am facing with Logstash.
I will explain my situation in as much detail as I can; if I am missing any bits, please let me know and I will be happy to provide all the answers.
I have a Windows virtual machine where I installed Logstash, Elasticsearch and Grafana.
On this virtual machine I am receiving tons of logs on port 3014, and out of almost 200 columns I only need 4 or 5. These are some of the fields I am receiving:
@timestamp
message
@version
_id
_index
_source
_type
ad.EventRecordID
ad.Opcode
ad.ProcessID
ad.ThreadID
ad.Version
ad.agentZoneName
ad.analyzedBy
ad.CustomerName
ad.destinationHosts
To keep the list short, I will stop at those fields.
What I want to achieve is to use Logstash to discard all the fields that I don't want to see at all.
For example, out of the fields I posted above, I would like to visualise only the following:
@timestamp
ad.customerName
ad.destinationHosts
while I want Logstash to discard all the others.
This approach will help me a lot, as I won't have to store and pay for data that is just junk to me (I am hosting Data Explorer and this VM on Azure, so I will have to pay for data ingestion etc.).
So far I have this Logstash config file, which listens for and processes every incoming event on port 3014:
input {
  syslog {
    port => 3014
    codec => cef
    syslog_field => "syslog"
    grok_pattern => "<%{POSINT:priority}>%{SYSLOGTIMESTAMP:timestamp}"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash_index"
  }
}
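In case it helps anyone answering: from reading the docs, I believe something like the `prune` filter (from the logstash-filter-prune plugin) with `whitelist_names` might do what I want, keeping only a whitelist of fields and dropping everything else. This is just a sketch based on my understanding, and the exact field names (e.g. whether the CEF codec produces `ad.customerName` as a top-level field or nested) are an assumption on my part:

```
filter {
  prune {
    # Keep only the fields I want; entries are regex patterns,
    # so anchor them and escape the dots in the field names.
    # NOTE: prune only matches top-level fields; if the CEF codec
    # nests these under another field, this would need adjusting.
    whitelist_names => [ "^@timestamp$", "^ad\.customerName$", "^ad\.destinationHosts$" ]
  }
}
```

The alternative I found would be a `mutate` filter with `remove_field`, but with almost 200 fields to drop, a whitelist seems more practical. Could someone confirm whether this is the right approach?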
Later on I will change the output, as I will be using the Kusto plugin.
For now I am trying to sort out the input and the filtering.
I am really sorry to bother you with this basic stuff; I love ELK, but I am totally new to this and I don't know where to start.
Thank you very much for your time, patience and help.