I am creating a POC of ELK for analysing Windows event logs, but I can't figure out how to apply filters to these event logs. Is there a predefined pattern that can be used directly in the grok filter, as there is for syslog?
If not, how can I define my own regex for the event logs and use it in the Logstash filter?
And what's sending the events via TCP? What I'm really getting at is: what do the events currently look like? The output from a stdout { codec => rubydebug } output would be useful.
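For reference, adding something like this to your config will print each event to the console with all of its fields, so we can see exactly what Logstash is receiving:

    output {
      # Dump every event, including all parsed fields, in a readable form
      stdout { codec => rubydebug }
    }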
Okay. I don't have time to help you fix the grok expression, but once you extract the timestamp and the other initial fields you should be able to use a csv filter to split the remaining string on the tab characters.
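Roughly something like this, assuming the grok stage leaves the tab-separated remainder in a field (the field name "rest" and the column names below are only placeholders for whatever your events actually contain):

    filter {
      csv {
        # Hypothetical field produced by the earlier grok stage
        source    => "rest"
        # A literal tab character as the separator
        separator => "	"
        # Example column names; replace with the real field order
        columns   => ["account_name", "account_domain", "logon_id"]
      }
    }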
I tried the csv filter, but it doesn't work well with these logs, as I am not getting the fields I need. So now I am trying to create custom patterns. I have tried something like this
ACCNAME (?=.*Account Name:\s\w+\s)
by creating a new pattern file.
I then referenced it in the config file with patterns_dir => ["/opt/logstash/patterns"]
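In the filter section I use it roughly like this (account_name is just the field name I want to end up with; I am not sure the lookahead above actually captures a value, so in the pattern file I am also trying a plain value pattern such as ACCNAME [a-zA-Z0-9.$_-]+ instead):

    filter {
      grok {
        patterns_dir => ["/opt/logstash/patterns"]
        # Match the literal "Account Name:" label and capture the value into account_name
        match => { "message" => "Account Name:\s+%{ACCNAME:account_name}" }
      }
    }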