Parse Elasticsearch response JSON into CSV

Hi,

I need to parse an Elasticsearch search response JSON into CSV using Logstash. Only the values under "fields" should go into the CSV file.

{
  "took" : 11,
  "timed_out" : false,
  "_shards" : {
    "total" : 3,
    "successful" : 3,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 2434,
    "max_score" : null,
    "hits" : [
      {
        "_index" : "index1",
        "_type" : "type1",
        "_id" : "id",
        "_score" : null,
        "fields" : {
          "field1" : [
            "454"
          ],
          "field2" : [
            "777"
          ],
          "field3" : [
            "6767"
          ]
        }
      }
      {
        "_index" : "index1",
        "_type" : "type1",
        "_id" : "id2",
        "_score" : null,
        "fields" : {
          "field1" : [
            "242"
          ],
          "field2" : [
            "434"
          ],
          "field3" : [
            "2323"
          ]
        }
      }
    ]
  }
}

Your JSON is not valid, since the entries in the hits array are not separated by a comma. If you fix that, you can parse it using a json filter, then split it with

split { field => "[hits][hits]" }

and then use a csv output with

 fields => [ "[hits][hits][fields][field1][0]", "[hits][hits][fields][field2][0]", "[hits][hits][fields][field3][0]" ]
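
For reference, here is a minimal sketch of what the whole pipeline could look like. The file paths (/tmp/response.json, /tmp/hits.csv) and the choice of a file input are assumptions, since the thread does not say how the response is fed to Logstash; it assumes the response is stored as a single-line (compacted) JSON document so that it arrives as one event.

input {
  # assumption: the whole response sits on one line in this file
  file {
    path => "/tmp/response.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  # parse the raw JSON string into nested fields
  json { source => "message" }

  # emit one event per element of the hits array
  split { field => "[hits][hits]" }
}

output {
  csv {
    path   => "/tmp/hits.csv"
    # each queried field comes back as a single-element array, hence the [0]
    fields => [ "[hits][hits][fields][field1][0]", "[hits][hits][fields][field2][0]", "[hits][hits][fields][field3][0]" ]
  }
}

With the sample document above (and the missing comma fixed), this should produce one CSV row per hit, e.g. 454,777,6767 and 242,434,2323.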

Thanks for your quick reply, it works fine and I got the expected result. :grinning:
