I'm attempting to read JSON data from ES and convert it to a CSV file, but I'm getting an unexpected result.
I'd like to take selected fields from each JSON document and write them to a CSV.
Here's my configuration file for Logstash...
input {
  elasticsearch {
    hosts => ["server:port"]
    size => "100"
    index => "tibcostats-*"
    scroll => "1m"
    docinfo => true
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  csv {
    path => "/Users/jonathan.shelley/Downloads/test1.csv"
    fields => ["inputtype","[message][TibcoStat][Activity][ExecutionCount]"]
  }
  stdout { codec => rubydebug }
}
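In case it's relevant, I'm launching Logstash in the usual way, something like this (the config file name here is just a placeholder):

bin/logstash -f es-to-csv.conf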
A sample JSON document that's being received from ES...
{
    "environment" => "perf",
    "@timestamp" => 2016-11-06T00:55:02.310Z,
    "messagetype" => "tibcostats",
    "@version" => "1",
    "inputtype" => "jms",
    "TibcoStat" => {
        "Activity" => {
            "ExecutionCount" => 235
        }
    },
    "type" => "tibcostats"
}
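To be explicit about what I'm after: given the document above and the fields list in my csv output, I'd expect each CSV row to look something like this:

jms,235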
Instead, when I run this configuration, the CSV file just contains this line repeated for every event:
2016-11-07T08:24:25.543Z %{host} %{message}
Am I missing something obvious? Thanks in advance.