Elasticsearch to CSV using Logstash

Hi all,

I am trying to export Elasticsearch index data to CSV using Logstash. Below is my configuration:

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "index_name"
    query => '
      {
        "query": {
          "match_all": {}
        }
      }
    '
  }
}
output {
  csv {
    # Elasticsearch field names to write as CSV columns
    fields => [ "field1", "field2" ]
    # Path where the output file is stored
    path => "/etc/logstash/scrap/index_name.csv"
  }
}
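For reference, the elasticsearch input plugin pages through results with the scroll API, and its size and scroll options can be set explicitly. Below is a sketch of the same input with those options spelled out; the values are just the plugin's documented defaults plus a longer scroll timeout, not a confirmed fix:

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "index_name"
    # Hits fetched per scroll page (plugin default: 1000)
    size => 1000
    # How long each scroll context is kept alive (plugin default: "1m")
    scroll => "5m"
    query => '{ "query": { "match_all": {} } }'
  }
}

If the scroll context expires partway through a large index, a longer scroll value might let the export complete, though that is only a guess.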

However, only partial data ends up in the CSV instead of the whole index. Below is the output of the log file:

[INFO ] 2019-08-08 13:27:01.368 [Ruby-0-Thread-22@[main]>worker18: :1] csv - Opening file {:path=>"/etc/logstash/scrap/index_name.csv"}
[INFO ] 2019-08-08 13:30:29.187 [Ruby-0-Thread-26@[main]>worker22: :1] csv - Closing file /etc/logstash/scrap/index_name.csv
[INFO ] 2019-08-08 13:30:30.776 [Ruby-0-Thread-24@[main]>worker20: :1] csv - Opening file {:path=>"/etc/logstash/scrap/index_name.csv"}
[INFO ] 2019-08-08 13:31:56.951 [[main]-pipeline-manager] pipeline - Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x4b4d65bb run>"}

The file is opened and closed multiple times, by different worker threads going by the log (worker18, worker20, worker22), and this might be the issue. Please suggest how to fix it.
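If concurrent workers writing to the same file turn out to be the problem, one possible thing to try (an assumption on my part, not a confirmed fix) is to run the pipeline with a single worker thread via the -w / --pipeline.workers flag:

# Run the pipeline with a single worker so only one thread writes to the CSV
# (the config file path below is illustrative)
bin/logstash -w 1 -f /etc/logstash/conf.d/export.conf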
