Grok filter issue with data in Logstash config

Hi Team,
I am facing an issue while using a grok filter pattern for our logs.
Here is my log message:
2019-05-09 08:18:41.6499|INFO|Processor|8552|5825|Content|Calling for job id 5776, logid 17586

Logstash config:

input {
  beats {
    port => 5044
  }
}

filter {

  mutate {
    add_field => { "[@metadata][index]" => "%{[fields][type]}" }
  }

  if [@metadata][index] == "imagelog" {
    grok {
      match => ["message", "(?<DATETIME>(([0-9]+)-)+ ([0-9]+:)+.*)|%{WORD:LOGLEVEL}|%{WORD:LOGSOURCE}|%{NUMBER:Id}|%{NUMBER:FileId}|%{DATA:Name}|%{GREEDYDATA:LOGMESSAGE}"]
    }
  }
  else {
    grok {
      match => ["message", "(?<DATETIME>(([0-9]+)-*)+ ([0-9]+:*)+.*)\|%{WORD:LOGLEVEL}\|%{WORD:LOGSOURCE}\|%{GREEDYDATA:LOGMESSAGE}"]
    }
  }
}

In my case the else condition works fine, but the if condition does not return the required result.
It returns the following result in Kibana:
DATETIME field returns: 2019-05-09 08:18:41.6499, 2019-05-09 08:18:41.6499|INFO|Processor|8552
LOGLEVEL field returns: INFO, 5825
In LOGLEVEL I want to return only INFO, as in the log message.
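
(It looks like the unescaped | characters in the imagelog pattern act as regex alternation rather than literal pipe delimiters, so the fields are not split as intended. Escaping them, as is already done in the else branch, would give a pattern along these lines, keeping the original field names:

grok {
  match => ["message", "(?<DATETIME>(([0-9]+)-*)+ ([0-9]+:*)+.*)\|%{WORD:LOGLEVEL}\|%{WORD:LOGSOURCE}\|%{NUMBER:Id}\|%{NUMBER:FileId}\|%{DATA:Name}\|%{GREEDYDATA:LOGMESSAGE}"]
}
)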

If you have a message in which the fields are delimited, I would recommend using dissect rather than grok.

dissect { mapping => { "message" => "%{ts}|%{loglevel}|%{logsource}|%{id}|%{fileid}|%{name}|%{message}" } }
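
For the sample line above, that mapping should yield roughly these fields (a sketch of the expected result; note that dissect overwrites the original message field with the last column):

ts        => 2019-05-09 08:18:41.6499
loglevel  => INFO
logsource => Processor
id        => 8552
fileid    => 5825
name      => Content
message   => Calling for job id 5776, logid 17586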

If the entire set of events is pipe delimited you could also consider using a csv filter.
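
A minimal sketch of that approach (the column names here are assumptions, matching the dissect mapping above):

csv {
  separator => "|"
  columns => ["ts", "loglevel", "logsource", "id", "fileid", "name", "message"]
}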

Thanks Badger,
Today it's working fine without any changes.
I'll also try the dissect filter for better performance.

Does %{ts} stand for timestamp?

Yes.
