Logstash - JSON Parsing is very slow

Hi,

I need to convert an Elasticsearch search response (JSON) to CSV using Logstash.

JSON File:

{
  "took" : 11,
  "timed_out" : false,
  "_shards" : {
    "total" : 3,
    "successful" : 3,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 2434,
    "max_score" : null,
    "hits" : [
      {
        "_index" : "index1",
        "_type" : "type1",
        "_id" : "id",
        "_score" : null,
        "fields" : {
          "field1" : [ "454" ],
          "field2" : [ "777" ],
          "field3" : [ "6767" ]
        }
      },
      {
        "_index" : "index1",
        "_type" : "type1",
        "_id" : "id2",
        "_score" : null,
        "fields" : {
          "field1" : [ "242" ],
          "field2" : [ "434" ],
          "field3" : [ "2323" ]
        }
      }
    ]
  }
}

Logstash configuration:

input {
  file {
    type => "json"
    codec => "json"
    path => "/usr/share/logstash/bin/input.json"
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"
  }

  split {
    field => "[hits][hits]"
  }
}

output {
  csv {
    fields => ["[hits][hits][fields][field1][0]", "[hits][hits][fields][field2][0]", "[hits][hits][fields][field3][0]", ...]
    path => "/usr/share/logstash/bin/output.csv"
  }
}
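As a throughput baseline, it can help to run the same flattening outside Logstash and compare timings. Below is a minimal Python sketch, assuming the response shape shown above; the inline sample values and the `output.csv` path are placeholders, not part of the original setup:

```python
import csv
import json

# Hypothetical Elasticsearch response shaped like the sample above.
raw = """
{
  "hits": {
    "hits": [
      {"fields": {"field1": ["454"], "field2": ["777"], "field3": ["6767"]}},
      {"fields": {"field1": ["242"], "field2": ["434"], "field3": ["2323"]}}
    ]
  }
}
"""

response = json.loads(raw)

# Write one CSV row per hit, unwrapping each single-element field array.
with open("output.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for hit in response["hits"]["hits"]:
        fields = hit["fields"]
        writer.writerow([fields["field1"][0],
                         fields["field2"][0],
                         fields["field3"][0]])
```

If a plain script like this processes the same 10,000 records in seconds, the bottleneck is somewhere in the Logstash pipeline rather than in the JSON-to-CSV work itself.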

It converts the JSON to CSV successfully, but the conversion is very slow: about 20 minutes for 10,000 records.

Am I doing something wrong in my Logstash configuration?

Thanks,
Ravi

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.