Elasticsearch/Logstash input to CSV output mapping question

I'm attempting to read JSON data from ES and convert it to a CSV file, but I'm getting an unexpected result.

I'd like to take selected fields from each JSON document and write them to a CSV.

Here's my configuration file for Logstash...

input {
  elasticsearch {
    hosts => ["server:port"]
    size => "100"
    index => "tibcostats-*"
    scroll => "1m"
    docinfo => true
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  csv {
    path => "/Users/jonathan.shelley/Downloads/test1.csv"
    fields => ["inputtype","[message][TibcoStat][Activity][ExecutionCount]"]
  }
  stdout { codec => rubydebug }
}

A sample event received from ES, as printed by the rubydebug stdout output...
{
  "environment" => "perf",
  "@timestamp" => 2016-11-06T00:55:02.310Z,
  "messagetype" => "tibcostats",
  "@version" => "1",
  "inputtype" => "jms",
  "TibcoStat" => {
    "Activity" => {
      "ExecutionCount" => 235
    }
  },
  "type" => "tibcostats"
}
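
Side note: in the sample event, TibcoStat is a top-level field rather than nested under message, so I suspect the field reference should drop the [message] prefix. A corrected output block (my assumption, based on the sample above) would look like:

output {
  csv {
    path => "/Users/jonathan.shelley/Downloads/test1.csv"
    fields => ["inputtype","[TibcoStat][Activity][ExecutionCount]"]
  }
}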

Either way, my CSV file ends up containing the same line repeated for every event:

2016-11-07T08:24:25.543Z %{host} %{message}

The %{host} and %{message} placeholders are unresolved sprintf references, so it looks as though the output is writing a default line format rather than the fields I configured.

Am I missing something obvious? Thanks in advance.

Are you running Logstash 5?

Yes, Logstash 5.0.0

Known bug in 5.0's csv output plugin: it ignores the fields setting and writes the event's default string representation instead, which is exactly the "timestamp %{host} %{message}" line you're seeing.
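
Until the fix ships, one possible stopgap (an untested sketch, and it assumes the corrected [TibcoStat][Activity][ExecutionCount] field path from above) is to bypass the csv output and build each line yourself with the file output and a line codec:

output {
  file {
    path => "/Users/jonathan.shelley/Downloads/test1.csv"
    # sprintf references are resolved per event;
    # note this does no CSV quoting or escaping for you
    codec => line {
      format => "%{inputtype},%{[TibcoStat][Activity][ExecutionCount]}"
    }
  }
}

Since this formats each line manually, it won't handle embedded commas or quotes in field values, so treat it as a stopgap rather than a replacement for the csv output.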

Many thanks. Will keep an eye on the bug report.
