Parsing nested JSON logs in Logstash

I have a JSON document that I need to store in Elasticsearch.

Suppose this is the JSON:

{"name":"luvpreet","tasks":{"task1":"logstash","task2":"elastic"},"time":{"time1":"1week","time2":"2week"}}

And I want these fields:

name : luvpreet
task1 : logstash
task2 : elastic
time1 : 1week
time2 : 2week

Will the filter below be enough?

filter {
  json {
    source => "message"
  }
}

Please tell me how to do it.

Is the number of task and time values fixed or will it vary arbitrarily? That is, will there always be task1 and task2 and never anything else?

Yes, they will change.

Actually, this is just an example I made for demonstration purposes. My actual JSON is different but in the same format.
Yes, the tasks will change.

The built-in filters for moving fields don't have any loop or wildcard support, so you'll have to write a small snippet of Ruby code in a ruby filter to move all subfields to the top level. This isn't the first time this question has been asked, so you should be able to find further details in the archives.
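Something along these lines could work as a starting point. This is a minimal sketch, assuming the Logstash 5.x event API and the parent field names (tasks, time) from the example JSON above; adapt the list of parents to your real documents:

filter {
  ruby {
    code => "
      # Copy every subfield of the listed parent fields onto the top level,
      # then drop the parent field itself.
      ['tasks', 'time'].each do |parent|
        nested = event.get(parent)
        if nested.is_a?(Hash)
          nested.each { |key, value| event.set(key, value) }
          event.remove(parent)
        end
      end
    "
  }
}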

Okay, I have dropped that idea and will now use a different approach.

Here it is: the logs will be coming from rsyslog in the format below:

{
  "@timestamp": "",
  "message": "34.201.66.248 - - [09/Jun/2017:17:02:19 +0530] \"GET /api/v1/services/homepage/?site_id=8 HTTP/1.1\" 200 1488 \"-\" \"PostmanRuntime/6.1.6\"",
  "host": "alpha",
  "programname": "nginx",
  "procid": "23421"
}

Can I make different fields out of the message, like IP, request, response, and client?

And what if task1 and task2 remain the same? Is it possible then?

Can I make different fields out of the message, like IP, request, response, and client?

Sure, parsing HTTP logs is a standard task. There are examples in the Logstash documentation.
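For example, here is a minimal sketch using the stock COMBINEDAPACHELOG grok pattern (the nginx line above follows the combined log format):

filter {
  grok {
    # COMBINEDAPACHELOG matches the combined access-log format and yields
    # fields such as clientip, verb, request, response, bytes, and agent.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

With the sample line above you'd get clientip = 34.201.66.248, verb = GET, request = /api/v1/services/homepage/?site_id=8, response = 200, and so on as separate fields.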

And what if task1 and task2 remain the same? Is it possible then?

Use the mutate filter's rename option to move fields.
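For example, a minimal sketch assuming the subfield names stay fixed as in the example JSON at the top of the thread (rename has no wildcard support, so every field has to be listed explicitly):

filter {
  mutate {
    # Replace these with your real field names; they are taken from the
    # example JSON earlier in the thread.
    rename => {
      "[tasks][task1]" => "task1"
      "[tasks][task2]" => "task2"
      "[time][time1]"  => "time1"
      "[time][time2]"  => "time2"
    }
  }
}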

