Export from Elasticsearch to CSV problem. Where am I wrong?

Hi, I have the following Logstash config file:
input {
  elasticsearch {
    hosts => ["localhost"]
    index => "myindex"
    type => "mytype"
    query => '{"query" : { "match_all" : {} }}'
  }
}
output {
  csv {
    fields => [my_fields_array]  # placeholder for my actual list of field names
    path => "export.csv"
    csv_options => { "col_sep" => "\t" "row_sep" => "\r\n" }
  }
}

Opening export.csv, I don't get any mapped data fields, only a list of entries like:
2016-11-02T07:41:15.917Z %{host} %{message}2016-11-02T07:41:16.708Z %{host}...

Any idea about this?
Thank you

Which version of Logstash? I can't reproduce this with 2.4.0:

$ cat in 
{"a": 1, "b": 2, "c": 3}
$ cat test.config 
input { stdin { codec => json } }
output {
  csv {
    fields => ["a", "b"]
    path => "out"
  }
}
$ /opt/logstash/bin/logstash -f test.config < in
Settings: Default pipeline workers: 8
Pipeline main started
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}
$ cat out 
1,2
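
If the plugin is working on your version, a quick way to rule out missing fields is to print the events with a rubydebug stdout next to the csv output and check that every name you list under fields actually exists on the events. A sketch (the field names here are placeholders, substitute your own):

output {
  # print each event in full so you can see which fields the csv output will look up
  stdout { codec => rubydebug }
  csv {
    fields => ["field1", "field2"]  # hypothetical names
    path => "/tmp/export.csv"
  }
}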

I am experiencing the same problem with Logstash 5.0. I tried it out on Logstash 2.1.0 and it works correctly.

My config file:

input {
  stdin {}
}
filter {
  grok {
    break_on_match => true
    match => [
      "message", "%{DATA:key} %{GREEDYDATA:data}"
    ]
  }
}
output {
  # the extra stdout outputs are just for debugging; every output receives each event
  stdout { codec => dots }
  stdout { codec => rubydebug }
  stdout {}

  csv {
    fields => ["key", "data"]
    path => "/tmp/output.csv"
  }
}

My input to stdin:
A B

Output CSV in Logstash 2.1.0
A,B

Output CSV in Logstash 5.0
2016-11-15T02:01:28.047Z SEA-1201050405 A B
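
That 5.0 output is just Logstash's default plain-codec line (timestamp, host, message), so the csv output appears to be ignoring fields entirely rather than failing on a field mapping. Until the plugin is fixed for 5.x, a possible workaround is to build the line by hand with a file output and a line codec. This is an untested sketch, and it does no CSV quoting or escaping, so it only suits simple scalar values:

output {
  file {
    path => "/tmp/output.csv"
    # sprintf the grokked fields into one comma-separated line per event
    codec => line { format => "%{key},%{data}" }
  }
}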


Logstash 5.0.0

