Decoding json data under custom field


#1

Hello,
let's say I have 2 software components which send JSON logs but with conflicting formats (for example, level is an integer for one and a string for the other). Is there any way I can use filebeat to decode the JSON data under different fields, such as comp1_json and comp2_json, rather than the default json field used by filebeat?
I was thinking about using autodiscover so filebeat can detect which component the logs are coming from. It does not seem possible, however, to configure the filebeat log input to achieve this result.
Any hint?
/Vincent


(Pier-Hugues Pellerin) #2

Hello @Vincehood

I am missing a few details, but I assume the logs are in two different files?
If that's the case, you can create two different inputs and configure the JSON target field differently for each.
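For example, a sketch using the decode_json_fields processor per input (the log paths and target field names below are just placeholders):

```yaml
filebeat.inputs:
  # Input for component 1: decode the JSON line into "comp1_json"
  - type: log
    paths:
      - /var/log/comp1/*.log   # placeholder path
    processors:
      - decode_json_fields:
          fields: ["message"]
          target: "comp1_json"

  # Input for component 2: decode the JSON line into "comp2_json"
  - type: log
    paths:
      - /var/log/comp2/*.log   # placeholder path
    processors:
      - decode_json_fields:
          fields: ["message"]
          target: "comp2_json"
```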


#3

Hello,
thanks for the reply.
I will try to be a bit more specific. We are in a Kubernetes environment with filebeat running as a DaemonSet.
Let's say we have two containers c1 & c2 producing json logs on stdout in different formats.

c1: {"timestamp": "2018-02-01T12:00:00.002+0100", "level": "error", "message": "something bad happened"}
c2: {"time": "2018-02-01T12:00:00.002+0100", "level": "1", "text-message": "something bad happened"}

Is it possible to configure filebeat to decode the JSON data from the Kubernetes log files in the following way:

json1: {
"timestamp": "2018-02-01T12:00:00.002+0100",
"level": "error",
"message": "something bad happened"
}
and
json2: {
"time": "2018-02-01T12:00:00.002+0100",
"level": "1",
"text-message": "something bad happened"
}
I cannot work out whether filebeat's JSON decoding can be configured to target different fields (json1 & json2) as above.

I thought this would be a way to prevent these formats from conflicting with each other when indexing in Elasticsearch.

Would be great if I could get a sample configuration example.
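Something like the following autodiscover sketch is what I have in mind, though I am not sure it is valid (the container names c1/c2 and the target fields json1/json2 are placeholders):

```yaml
filebeat.autodiscover:
  providers:
    - type: kubernetes
      templates:
        # Container c1: decode its JSON stdout under "json1"
        - condition:
            equals:
              kubernetes.container.name: "c1"
          config:
            - type: docker
              containers.ids:
                - "${data.kubernetes.container.id}"
              processors:
                - decode_json_fields:
                    fields: ["message"]
                    target: "json1"
        # Container c2: decode its JSON stdout under "json2"
        - condition:
            equals:
              kubernetes.container.name: "c2"
          config:
            - type: docker
              containers.ids:
                - "${data.kubernetes.container.id}"
              processors:
                - decode_json_fields:
                    fields: ["message"]
                    target: "json2"
```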

Regards
/Vincent