Get datetime from logs

I am trying to get logs into Kibana using Logstash and Filebeat. However, Kibana doesn't treat the date and time (which are in the logs) as a datetime field. Instead it indexes them as a string, and I cannot change the format either.
I'd appreciate your help.
Here are the configurations:

Logstash filter:

filter {
  grok {
    match => {
      "message" => "(?application_[^/])[^ ] [%{TIMESTAMP_ISO8601:logTime}] %{LOGLEVEL:logLevel} %{GREEDYDATA:LogMessage}"
    }
  }
}

Sample logs:

/yarn/container-logs/application_1621858977521_0151/container_1621858977521_0151_01_000004 [2021-06-28 02:38:10,542] INFO Started daemon with process name: 7796@slave1 (org.apache.spark.executor.CoarseGrainedExecutorBackend)
/yarn/container-logs/application_1621858977521_0151/container_1621858977521_0151_01_000004 [2021-06-28 02:38:10,547] INFO Registered signal handler for TERM (org.apache.spark.util.SignalUtils)
/yarn/container-logs/application_1621858977521_0151/container_1621858977521_0151_01_000004 [2021-06-28 02:38:10,548] INFO Registered signal handler for HUP (org.apache.spark.util.SignalUtils)


Your pattern does not follow the grok syntax.
First, if you want to create a custom pattern, you have to write `(?<field>patternHere)` instead of `(?field_patternHere)`.
Second, if you want to match a literal `[` or `]`, you have to escape it with a backslash.
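For example, a custom named capture for the application path in these logs, with the timestamp brackets escaped, could look like this (the field name `application` is just illustrative):

```
(?<application>application_[^/]+) ... \[%{TIMESTAMP_ISO8601:logTime}\]
```

The `(?<name>...)` form is the Oniguruma named-capture syntax that grok accepts for inline custom patterns.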

A pattern that matches the syntax of your logs:

%{NOTSPACE:application} \[%{TIMESTAMP_ISO8601:logTime}\] %{LOGLEVEL:logLevel} %{GREEDYDATA:LogMessage}

For date conversion, look at the date filter.
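Putting it together, a minimal filter sketch that parses the line and converts `logTime` into the event's `@timestamp` might look like this (the date format matches the sample lines, e.g. `2021-06-28 02:38:10,542`; adjust the timezone if your logs are not in UTC):

```
filter {
  grok {
    match => {
      "message" => "%{NOTSPACE:application} \[%{TIMESTAMP_ISO8601:logTime}\] %{LOGLEVEL:logLevel} %{GREEDYDATA:LogMessage}"
    }
  }
  date {
    # "2021-06-28 02:38:10,542" -> comma-separated milliseconds
    match  => [ "logTime", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"   # default target, shown here for clarity
  }
}
```

With `@timestamp` populated this way, Kibana will pick the field up as a date instead of a string.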

Thank you Cad. Appreciate it

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.