Parsing Log4j2 logs from filebeat based on defined Pattern Layout

(Sbienert) #1

I want to parse my logs from Filebeat into Elasticsearch as if there were an Elasticsearch appender in log4j2.xml, i.e. each field mapped to its own key with the proper value, rather than a single message field containing all field values as one string.

I already read that a grok filter in Logstash is probably the way to go. However, I am wondering if there is a way to generate a grok filter configuration from the defined Log4j2 pattern layout. If not, wouldn't that be possible, and maybe a good idea for extending the product? Or perhaps something for someone to write a plugin for.

Can you help me with a grok filter for this pattern layout?
%d{yyyy-MM-dd HH:mm:ss.SSS} ${LOG_LEVEL_PATTERN:-%5p} ${PID:- } --- [%t] %-40.40logger{39} : %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}
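For reference, this layout is the Spring Boot default console pattern. A minimal Logstash grok filter sketch for it might look like the following; the field names (`timestamp`, `level`, `pid`, `thread`, `logger`) are my own choices, not anything mandated by the layout, and the whitespace handling assumes the padded/right-aligned fields produced by `%5p` and `%-40.40logger{39}`:

```
filter {
  grok {
    # Matches e.g.:
    # 2024-01-15 10:23:45.123  INFO 12345 --- [main] c.e.demo.DemoApplication : Started app
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:level}\s+%{POSINT:pid}\s+---\s+\[\s*%{DATA:thread}\]\s+%{DATA:logger}\s+:\s+%{GREEDYDATA:log_message}"
    }
  }
}
```

Multiline exception traces (`%wEx`) would additionally need a multiline configuration on the Filebeat side so the whole stack trace arrives as one event before grok runs.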

Thank you very much

(Magnus Bäck) #2

Why not just use a layout that produces JSON? Then you don't have to do any parsing.

(Sbienert) #3

I cannot change the layout. These logs come from products that we also deliver to our customers, so changing the log4j2.xml is not a real option. That is also why I would rather not add a REST appender: I would have to re-add the appender and restart the app every time we install an updated version on our systems.

(Sbienert) #4

I think I can manage this with grok myself, so please put this on hold.

(Rohan Daga) #5

For me the grok is working fine so far.


(Sbienert) #6

Hello @magnusbaeck,
Grok works fine for me so far, but some log files have a different layout than others, with no single common pattern. My idea was to try a grok filter for each known pattern until one of them no longer produces a _grokparsefailure. For that I would have to check for _grokparsefailure in [tags] with an if condition and keep going through the patterns until one of them matches.

Do you have an idea how to do this?


(Magnus Bäck) #7

I don't understand. There can't be more than one _grokparsefailure tag. The tags field is essentially a set, i.e. it can't contain duplicates.

Why not just use a single grok filter that lists all the grok expressions you think you need, and then deal separately with any messages that fall through and end up with a _grokparsefailure tag?
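A sketch of that approach: the grok filter accepts an array of patterns and tries them in order, stopping at the first match, so a single filter can cover several layouts. The two patterns and field names below are illustrative placeholders, not the actual layouts from this thread:

```
filter {
  grok {
    # Patterns are tried top to bottom; the first match wins.
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:level}\s+%{POSINT:pid}\s+---\s+\[\s*%{DATA:thread}\]\s+%{DATA:logger}\s+:\s+%{GREEDYDATA:log_message}",
        "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:level}\s+%{GREEDYDATA:log_message}"
      ]
    }
  }

  # Only events that matched none of the patterns carry the failure tag.
  if "_grokparsefailure" in [tags] {
    mutate { add_tag => ["needs_review"] }
  }
}
```

This avoids chaining multiple grok filters with if conditions; the conditional is only needed once, for the fall-through case.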

(Sbienert) #8

I did not know that grok tries the listed patterns in order and skips the rest once one matches, only adding _grokparsefailure when none of them match. That solved it for me. Thanks for your help.

(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.