How to process logs with different structures with Logstash

Hi, I'm fairly new to Logstash. I want to ship my logs to Elasticsearch, but my log file contains entries with different structures, such as:
00000078 1 57118 20200708040053402 6408 2 admin IP-AC1FF2CB.io rsession21.exe ( Analytics ) 01F210039n 21.0.58.0 0 QI=57123

00000029 2 9895145 20200723131147294 1 QI=57523

My filter:
filter {
  grok {
    match => {
      "message" => "%{NUMBER:COL_01}%{SPACE}%{NUMBER:COL_02}%{SPACE}%{NUMBER:COL_03}%{SPACE}%{NUMBER:COL_04}%{SPACE}%{DATA:COL_05}%{SPACE}%{USERNAME:COL_06}%{SPACE}%{USERNAME:COL_07} (?<COL_08>[^{]*)%{SPACE}%{IP:COL_09}%{SPACE}%{NUMBER:COL_10}%{SPACE}%{GREEDYDATA:COL_11}"
    }
  }
}

I created a filter that is able to parse the first structure, but it does not match the second one, so those entries arrive in Elasticsearch as an unparsed message because the filter is not adapted to them.

Now my question is how to create a single filter that handles all of these log structures.

Thank you

If your log entries have a standard prefix then I would normally suggest picking that off with dissect, then using grok against the various formats of the rest of the log.
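For example, a minimal sketch of that approach, reusing your column names and assuming the four leading numeric columns are the shared prefix:

dissect {
  # Peel off the four numeric columns shared by every line and keep
  # the variable remainder in a metadata field for later parsing.
  mapping => {
    "message" => "%{COL_01} %{COL_02} %{COL_03} %{COL_04} %{[@metadata][restOfLine]}"
  }
}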

In this case I would consider doing it with a csv filter.

 csv { separator => " " columns => [ "COL_01", "COL_02", "COL_03", "COL_04", "[@metadata][restOfLine]" ] }

Then use grok with an array of patterns to match [@metadata][restOfLine].
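A sketch of what that could look like, assuming the two sample formats above (the first pattern reuses the tail of your original grok; the second is a guess at the short format, so adjust both to your real data):

grok {
  match => {
    "[@metadata][restOfLine]" => [
      # Long format: user, host, executable, IP, and so on.
      "%{DATA:COL_05}%{SPACE}%{USERNAME:COL_06}%{SPACE}%{USERNAME:COL_07} (?<COL_08>[^{]*)%{SPACE}%{IP:COL_09}%{SPACE}%{NUMBER:COL_10}%{SPACE}%{GREEDYDATA:COL_11}",
      # Short format: a single number followed by the QI= field.
      "%{NUMBER:COL_05}%{SPACE}%{GREEDYDATA:COL_06}"
    ]
  }
}

Grok tries the patterns in the array in order and stops at the first one that matches, so list the most specific pattern first.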

Thanks!! This helps 🙂
