I am completely new to ELK and have installed the ELK stack on my Windows computer.
My goal is to read and analyse log files (which are actually log4j output) sitting on a different server.
I am able to make Logstash read my file in raw format and display it in Kibana, but I need to parse the messages in the log files.
For example, my log looks like this:
2017-01-24 15:00:11,471 [36] INFO UpToDate.Editorial.Service.Topic.IcgDataIslandSaveUtil [(null)] - Step 0, Start saving data island. 0.0010006 seconds since last step, 0.0020005 seconds since start.
2017-01-24 15:00:11,489 [36] INFO UpToDate.Editorial.Service.Topic.TopicContentServiceImpl [(null)] - ad01q\sshyamde1 Save Data Island: <?xml version="1.0" encoding="utf-8"?>
I need to parse the message so that I can get the log level, message, and class name as separate fields.
I tried to accomplish this using a grok filter, but I get a grok compile error.
Please help me if anybody can provide the grok filter that will give me what I want.
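To illustrate the split being asked for, here is a plain-regex sketch of the same parse (Python only for illustration; the group names are my own choice, not from the thread):

```python
import re

# Rough regex equivalent of a grok pattern for the sample log4j line above.
# Group names (loglevel, pid, class, message) mirror the fields wanted.
LOG4J_LINE = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d+) "
    r"\[(?P<pid>\d+)\] "
    r"(?P<loglevel>[A-Z]+) "
    r"(?P<class>\S+) \[\(null\)\] - "
    r"(?P<message>.*)"
)

sample = ("2017-01-24 15:00:11,471 [36] INFO "
          "UpToDate.Editorial.Service.Topic.IcgDataIslandSaveUtil [(null)] - "
          "Step 0, Start saving data island.")

m = LOG4J_LINE.match(sample)
print(m.group("loglevel"))  # INFO
print(m.group("class"))     # UpToDate.Editorial.Service.Topic.IcgDataIslandSaveUtil
```

Grok patterns like `%{LOGLEVEL}` ultimately compile down to regexes much like these, so if the regex matches your line, the corresponding grok pattern should too.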
Thank you very much for your prompt reply, but before I go through the document I want to know if I am on the correct path.
I mean, is there any other way to parse log4j messages other than asking developers to change their logging to JSON format?
Thank you Magnus, I have now successfully made my grok filter.
Here is the grok filter I created:
%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND},%{NUMBER:line} \[%{POSINT:pid}\] %{LOGLEVEL:loglevel} %{GREEDYDATA:class} - %{GREEDYDATA:message}
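For context, a minimal Logstash pipeline wrapping that pattern might look like the sketch below. The file path and host are placeholders, not from the thread, and `overwrite` is needed because the pattern reuses the `message` field name:

```conf
input {
  file {
    path => "C:/logs/myapp.log"        # placeholder path
    start_position => "beginning"
  }
}

filter {
  grok {
    match => {
      "message" => "%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND},%{NUMBER:line} \[%{POSINT:pid}\] %{LOGLEVEL:loglevel} %{GREEDYDATA:class} - %{GREEDYDATA:message}"
    }
    overwrite => [ "message" ]         # replace the raw line with the parsed tail
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]        # placeholder host
  }
}
```

Note that entries like the XML dump in the second sample line span multiple physical lines, so a multiline codec on the input may also be needed to join continuation lines into one event.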
Pasting it here in case anybody needs it or finds it helpful.