Logstash CSV output plugin giving a problem

I wrote an output-csv.conf file and ran it from the bin folder of Logstash.

input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mongo3"
    query => '{
      "query": {
        "bool": {
          "must": {
            "match": {
              "content": "recognition"
            }
          }
        }
      }
    }'
  }
}

output {
  csv {
    fields => ["@timestamp","content","component","duration","host"]
    path => "D:/elasticsearch/excel/mongo3.csv"
  }
}

This is the input.
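For reference, I start the pipeline from the Logstash bin folder roughly like this (the conf file sits in the bin folder; the exact prompt path is just what I use here):

D:\elasticsearch\logstash-5.3.0\bin>logstash -f output-csv.conf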

Output:

13:53:40.533 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.533 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.549 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.549 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.549 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:41.526 [[main]<elasticsearch] ERROR logstash.pipeline - A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"mongo3", query=>"{\n "query": {\n "bool":{\n\t\t "must":{\n\t\t\t "match":{\n\t\t "content":"recognition"\n\t\t }\n\t\t\t\t\t}\n\t\t\t\t}\n }\n }", id=>"ec055bcc4495cb3c048db0ccab7a85a61d32de52-1", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_00a077e7-c90b-4455-8c0a-8e2aec3f1211", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
Error: [400] {"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Failed to parse request body"}],"type":"illegal_argument_exception","reason":"Failed to parse request body","caused_by":{"type":"json_parse_exception","reason":"Unrecognized token 'DnF1ZXJ5VGhlbkZldGNoBQAAAAAAAAMnFjNVazlMeGJOUXUtWTdsR0pROXUyNHcAAAAAAAADJhYzVWs5THhiTlF1LVk3bEdKUTl1MjR3AAAAAAAAAygWM1VrOUx4Yk5RdS1ZN2xHSlE5dTI0dwAAAAAAAAMpFjNVazlMeGJOUXUtWTdsR0pROXUyNHcAAAAAAAADKhYzVWs5THhiTlF1LVk3bEdKUTl1MjR3': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@40904328; line: 1, column: 457]"}},"status":400}
Exception: Elasticsearch::Transport::Transport::Errors::BadRequest
Stack: D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/transport/base.rb:201:in `__raise_transport_error'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/transport/base.rb:318:in `perform_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/client.rb:131:in `perform_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-5.0.3/lib/elasticsearch/api/actions/scroll.rb:62:in `scroll'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-input-elasticsearch-4.0.2/lib/logstash/inputs/elasticsearch.rb:187:in `scroll_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-input-elasticsearch-4.0.2/lib/logstash/inputs/elasticsearch.rb:156:in `process_next_scroll'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-input-elasticsearch-4.0.2/lib/logstash/inputs/elasticsearch.rb:148:in `run'
D:/elasticsearch/logstash-5.3.0/logstash-core/lib/logstash/pipeline.rb:425:in `inputworker'
D:/elasticsearch/logstash-5.3.0/logstash-core/lib/logstash/pipeline.rb:419:in `start_input'
13:53:41.541 [[main]>worker3] DEBUG logstash.pipeline - filter received {"event"=>{"severity"=>"I", "duration"=>"893", "path"=>"C:\Data\log\filter-mongologs-2017-03-18\mongodb3.log", "component"=>"COMMAND", "@timestamp"=>2017-03-31T09:17:34.583Z, "host"=>"Admin-PC", "context"=>"conn274726", "@version"=>"1", "content"=>"command hgthanka.Recognition command: aggregate { aggregate: "Recognition", pipeline: [ { $match: { Status: "Active", Template.Type: { $in: [ "Recognition", "KeyResult", "Objective", "Milestone", "Award", "Context", "Thanks", "Quick", "Congrats", "Newsfeed", "ProductItem", "PollResult", "GoalKeyResultUpdate" ] }, SuppressInFeed: false, RecipientMember.Location.hgId: "32a336e0-e27f-11e6-9a31-c1daaeaf8e28" } }, { $group: { _id: "$BatchId", max: { $max: "$ModifiedDate" } } }, { $sort: { max: -1 } }, { $skip: 0 }, { $limit: 20 }, { $project: { _id: 0, BatchId: "$_id" } } ] } keyUpdates:0 writeConflicts:0 numYields:2284 reslen:1238 locks:{ Global: { acquireCount: { r: 4574 } }, Database: { acquireCount: { r: 2287 } }, Collection: { acquireCount: { r: 2287 } } } protocol:op_query "}}
13:53:41.604 [[main]>worker3] DEBUG logstash.pipeline - filter received {"event"=>{"severity"=>"I", "duration"=>"899", "path"=>"C:\Data\log\filter-mongologs-2017-03-18\mongodb3.log", "component"=>"COMMAND", "@timestamp"=>2017-03-31T09:17:34.552Z, "host"=>"Admin-PC", "context"=>"conn275532", "@version"=>"1", "content"=>"command hgthanka.Recognition command: aggregate { aggregate: "Recognition", pipeline: [ { $match: { Status: "Active", Template.Type: { $in: [ "Recognition", "KeyResult", "Objective", "Milestone", "Award", "Context", "Thanks", "Quick", "Congrats", "Newsfeed", "ProductItem", "PollResult", "GoalKeyResultUpdate" ] }, SuppressInFeed: false, RecipientMember.Location.hgId: "8a245490-aa93-11e6-9402-abce09847bec" } }, { $group: { _id: "$BatchId", max: { $max: "$ModifiedDate" } } }, { $sort: { max: -1 } }, { $skip: 0 }, { $limit: 20 }, { $project: { _id: 0, BatchId: "$_id" } } ] } keyUpdates:0 writeConflicts:0 numYields:2284 reslen:1238 locks:{ Global: { acquireCount: { r: 4574 } }, Database: { acquireCount: { r: 2287 } }, Collection: { acquireCount: { r: 2287 } } } protocol:op_query "}}
13:53:41.619 [[main]>worker3] DEBUG logstash.pipeline - filter received {"event"=>{"severity"=>"I", "duration"=>"887", "path"=>"C:\Data\log\filter-mongologs-2017-03-18\mongodb3.log", "component"=>"COMMAND", "@timestamp"=>2017-03-31T09:17:34.583Z, "host"=>"Admin-PC", "context"=>"conn276785", "@version"=>"1", "content"=>"command hgthanka.Reco

It gives continuous, repeated output.
It also gives an error about the elasticsearch input plugin stopping and restarting.
How do I resolve this? I am using ELK version 5.3.0.

Identified as bug: https://github.com/logstash-plugins/logstash-input-elasticsearch/issues/62

Thanks @Andrei_Stefan
I installed version 4.0.1 of the elasticsearch input plugin and ran the same command to load Elasticsearch data into a CSV file, but it gives the same output.
I get the same continuous, repeating output I mentioned above.

Is there an upgraded version of the logstash-output-csv plugin?
I am using version 3.0.3.
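For reference, I check the installed plugin versions roughly like this (the list subcommand with --verbose; the exact output format may differ by version):

D:\Elastic\logstash-5.3.0\bin>logstash-plugin list --verbose logstash-output-csv
D:\Elastic\logstash-5.3.0\bin>logstash-plugin list --verbose logstash-input-elasticsearch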

The fix will go in es input plugin version 4.0.3.

@Andrei_Stefan Thanks for the reply.

I tried this as well, but it gives the following error:

D:\Elastic\logstash-5.3.0\bin>logstash-plugin install --version 4.0.3 logstash-input-elasticsearch
Validating logstash-input-elasticsearch-4.0.3
Plugin logstash-input-elasticsearch version 4.0.3 does not exist
ERROR: Installation aborted, verification failed for logstash-input-elasticsearch 4.0.3

The error is almost self-explanatory: that version hasn't been released yet.

Then how can I resolve the continuous-output error?
Can you please suggest a solution for that? :relaxed:

Try version 4.0.0 of the elasticsearch input.
The issue Andrei linked says that versions greater than or equal to 4.0.1 have the bug.
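If it helps, the downgrade should look roughly like this from the Logstash bin folder (removing the currently installed version first may be necessary; the path is the one from your earlier post):

D:\Elastic\logstash-5.3.0\bin>logstash-plugin remove logstash-input-elasticsearch
D:\Elastic\logstash-5.3.0\bin>logstash-plugin install --version 4.0.0 logstash-input-elasticsearch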

Thanks @guyboertje
I will try it and let you know.

@guyboertje Thank you.
It works.
