I am new to the Elastic Stack and just installed and configured ELK 6.2.4. I send logs via Filebeat and Winlogbeat to Logstash, which is supposed to parse them via grok into separate fields and hand them over to Elasticsearch, from where Kibana visualizes them.
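For reference, my Logstash pipeline currently looks roughly like the sketch below (the port, host and index name are just what I set up locally, so treat those as my own choices):

```
input {
  beats {
    port => 5044
  }
}

filter {
  # the grok filters should go here - this is the part I am struggling with
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
```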
However, I have a hard time understanding grok and it doesn't work the way I want it to. I have these event logs; each line is equivalent to one "message" field in Kibana. As they are in German, I will translate them as well, just to give you an idea of what the structure looks like.
> PD-GenerateJobFile: Jobdatei 4752771201.job wurde erfolgreich erstellt.
> Dienst PathSvc Fehler bei Durchlauf: Der Zugriff auf den Pfad "D:\Path\Temp\artwork\v515k303.zz3" wurde verweigert.
> PD-GenerateJobFile: Jobdatei 4752771201.job wurde erfolgreich unter D:\Jobdatei\4752771201.job gespeichert.
> PD-Dienst: Entpacktes F42-ZIP im ARCHIV D:\Path\ARCHIV\2018-05-18\4752771201\ gespeichert.
> PD-xJDFManipulation XJDF: 4752771201_DD.ptk wurde erfolgreich generiert.
> PD-xJDFManipulation XJDF: 4752771201_DD.ptk wurde erfolgreich in HF: D:\DD_Hotfolder\ verschoben.
> PD-Dienst: D:\Path\XJDF_47527712_01_B04_ST_250ma_500_1-2.zip in D:\Path\Temp\ entpackt.
> PD-Dienst: Ptk-File D:\Path\Temp\47527712.ptk umbenannt in 4752771201.ptk
> PD-GeneratePPJdf: Datei 4752771201.xml wurde erfolgreich nach D:\PP-jdf-Testfolder\ kopiert.
> PD-SendxJDF: 4752771201.ptk wurde nach D:\Xjdf\ kopiert.
> --------------------------------| Neuer F42 Dateneingang: XJDF_47476836_03_A18_ST_70ma_2000_1-2.zip |--------------------------------
Now the English version:
> PD-GenerateJobFile: Jobfile 4752771201.job has been successfully created.
> Service PathSvc Error while running: Access to the path "D:\Path\Temp\artwork\v515k303.zz3" has been denied.
> PD-GenerateJobFile: Jobfile 4752771201.job has been successfully saved in D:\Jobfile\4752771201.job.
> PD-Service: Unzipped F42-ZIP has been saved in ARCHIVE D:\Path\ARCHIVE\2018-05-18\4752771201\.
> PD-xJDFManipulation XJDF: 4752771201_DD.ptk has successfully been generated.
> PD-xJDFManipulation XJDF: 4752771201_DD.ptk has successfully been moved to HF: D:\DD_Hotfolder\.
> PD-Service: D:\Path\XJDF_47527712_01_B04_ST_250ma_500_1-2.zip unzipped to D:\Path\Temp\.
> PD-Service: Ptk-File D:\Path\Temp\47527712.ptk renamed to 4752771201.ptk
> PD-GeneratePPJdf: File 4752771201.xml has successfully been copied to D:\PP-jdf-Testfolder\.
> PD-SendxJDF: 4752771201.ptk has been copied to D:\Xjdf\.
> --------------------------------| New F42 Input: XJDF_47476836_03_A18_ST_70ma_2000_1-2.zip |--------------------------------
What I want to do:
- I want grok filters that parse each of these log lines correctly into separate fields.
- I need a status field that contains the value "Error" whenever a log line reports an error. I need this for Kibana and also for triggering an email alert (see the sketch after this list).
- I want each path in its own field.
- I would like to know which service the log line is about (PD-abcdefg).
- I want a description of what happened, basically the log message or the error description.
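To make it clearer what I am imagining, here is a rough sketch of the kind of filter I think I need. The field names `service`, `description` and `status` are just names I made up, and `PD-\w+` is only a guess at how the service names look:

```
filter {
  grok {
    # split e.g. "PD-GenerateJobFile: Jobdatei ... erstellt." into service + rest
    match => { "message" => "(?<service>PD-\w+): %{GREEDYDATA:description}" }
  }
  # mark lines containing "Fehler" with status = "Error" for Kibana and alerting
  if "Fehler" in [message] {
    mutate {
      add_field => { "status" => "Error" }
    }
  }
}
```

The error line starting with "Dienst PathSvc" does not begin with PD-, so it would probably need its own pattern, which is part of what I am asking about.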
Apart from that rough idea I am completely lost, as I am not sure how to do this correctly. I played around with it a bit, but the fields ended up with the wrong values, and I am also not sure how to handle spaces and symbols (or how to leave them out and not capture them at all).
This is as far as I got:
For the log line: --------------------------------| Neuer F42 Dateneingang: XJDF_47476836_03_A18_ST_70ma_2000_1-2.zip |--------------------------------
match => ["message", ".* Neuer %{WORD:file} Dateneingang: %{GREEDYDATA:datei_path} |.*"]
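In my Logstash configuration this sits in a grok block roughly like the one below. I am not sure whether the literal | in the pattern needs to be escaped as \|, which may be part of my problem:

```
filter {
  grok {
    # meant to match the "Neuer F42 Dateneingang" separator lines
    match => ["message", ".* Neuer %{WORD:file} Dateneingang: %{GREEDYDATA:datei_path} |.*"]
  }
}
```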
For the log line: PD-Dienst: D:\Path\XJDF_47527712_01_B04_ST_250ma_500_1-2.zip in D:\Path\Temp\ entpackt.
match => ["message", "%{WORD:service}: %{PATH:path1} in %{PATH:path2}"]
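From the documentation I understand that several patterns can be listed for the same field, so I assume my two attempts would eventually be combined into something like this (the order of the patterns and the break_on_match behaviour are things I am still unsure about):

```
filter {
  grok {
    # first matching pattern wins (break_on_match defaults to true)
    match => {
      "message" => [
        ".* Neuer %{WORD:file} Dateneingang: %{GREEDYDATA:datei_path} |.*",
        "%{WORD:service}: %{PATH:path1} in %{PATH:path2}"
      ]
    }
  }
}
```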
Can you please help me write these grok filters, or point me in the right direction? I have already looked through the official documentation, logz.io, several video tutorials and courses, and used the grok patterns recommended there to build mine.
And yet I am not sure how to apply any of it to my own use case: this is a Windows service logging to the event log that I want to get into Kibana, and most examples are about Linux applications with premade patterns.
I would also like to mention that I tried Winlogbeat, which just puts the whole message into a field called "param1" that contains the entire message again.