Output:
13:53:40.533 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.533 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.549 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.549 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:40.549 [[main]>worker3] DEBUG logstash.outputs.csv - File, writing event to file. {:filename=>"D:/elasticsearch/excel/mongo3.csv"}
13:53:41.526 [[main]<elasticsearch] ERROR logstash.pipeline - A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"mongo3", query=>"{\n "query": {\n "bool":{\n\t\t "must":{\n\t\t\t "match":{\n\t\t "content":"recognition"\n\t\t }\n\t\t\t\t\t}\n\t\t\t\t}\n }\n }", id=>"ec055bcc4495cb3c048db0ccab7a85a61d32de52-1", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_00a077e7-c90b-4455-8c0a-8e2aec3f1211", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
Error: [400] {"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Failed to parse request body"}],"type":"illegal_argument_exception","reason":"Failed to parse request body","caused_by":{"type":"json_parse_exception","reason":"Unrecognized token 'DnF1ZXJ5VGhlbkZldGNoBQAAAAAAAAMnFjNVazlMeGJOUXUtWTdsR0pROXUyNHcAAAAAAAADJhYzVWs5THhiTlF1LVk3bEdKUTl1MjR3AAAAAAAAAygWM1VrOUx4Yk5RdS1ZN2xHSlE5dTI0dwAAAAAAAAMpFjNVazlMeGJOUXUtWTdsR0pROXUyNHcAAAAAAAADKhYzVWs5THhiTlF1LVk3bEdKUTl1MjR3': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@40904328; line: 1, column: 457]"}},"status":400}
Exception: Elasticsearch::Transport::Transport::Errors::BadRequest
Stack: D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/transport/base.rb:201:in __raise_transport_error'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/transport/base.rb:318:in perform_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/transport/http/faraday.rb:20:in perform_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-5.0.3/lib/elasticsearch/transport/client.rb:131:in perform_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-5.0.3/lib/elasticsearch/api/actions/scroll.rb:62:in scroll'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-input-elasticsearch-4.0.2/lib/logstash/inputs/elasticsearch.rb:187:in scroll_request'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-input-elasticsearch-4.0.2/lib/logstash/inputs/elasticsearch.rb:156:in process_next_scroll'
D:/elasticsearch/logstash-5.3.0/vendor/bundle/jruby/1.9/gems/logstash-input-elasticsearch-4.0.2/lib/logstash/inputs/elasticsearch.rb:148:in run'
D:/elasticsearch/logstash-5.3.0/logstash-core/lib/logstash/pipeline.rb:425:in inputworker'
D:/elasticsearch/logstash-5.3.0/logstash-core/lib/logstash/pipeline.rb:419:in start_input'
13:53:41.541 [[main]>worker3] DEBUG logstash.pipeline - filter received {"event"=>{"severity"=>"I", "duration"=>"893", "path"=>"C:\Data\log\filter-mongologs-2017-03-18\mongodb3.log", "component"=>"COMMAND", "@timestamp"=>2017-03-31T09:17:34.583Z, "host"=>"Admin-PC", "context"=>"conn274726", "@version"=>"1", "content"=>"command hgthanka.Recognition command: aggregate { aggregate: "Recognition", pipeline: [ { $match: { Status: "Active", Template.Type: { $in: [ "Recognition", "KeyResult", "Objective", "Milestone", "Award", "Context", "Thanks", "Quick", "Congrats", "Newsfeed", "ProductItem", "PollResult", "GoalKeyResultUpdate" ] }, SuppressInFeed: false, RecipientMember.Location.hgId: "32a336e0-e27f-11e6-9a31-c1daaeaf8e28" } }, { $group: { _id: "$BatchId", max: { $max: "$ModifiedDate" } } }, { $sort: { max: -1 } }, { $skip: 0 }, { $limit: 20 }, { $project: { _id: 0, BatchId: "$_id" } } ] } keyUpdates:0 writeConflicts:0 numYields:2284 reslen:1238 locks:{ Global: { acquireCount: { r: 4574 } }, Database: { acquireCount: { r: 2287 } }, Collection: { acquireCount: { r: 2287 } } } protocol:op_query "}}
13:53:41.604 [[main]>worker3] DEBUG logstash.pipeline - filter received {"event"=>{"severity"=>"I", "duration"=>"899", "path"=>"C:\Data\log\filter-mongologs-2017-03-18\mongodb3.log", "component"=>"COMMAND", "@timestamp"=>2017-03-31T09:17:34.552Z, "host"=>"Admin-PC", "context"=>"conn275532", "@version"=>"1", "content"=>"command hgthanka.Recognition command: aggregate { aggregate: "Recognition", pipeline: [ { $match: { Status: "Active", Template.Type: { $in: [ "Recognition", "KeyResult", "Objective", "Milestone", "Award", "Context", "Thanks", "Quick", "Congrats", "Newsfeed", "ProductItem", "PollResult", "GoalKeyResultUpdate" ] }, SuppressInFeed: false, RecipientMember.Location.hgId: "8a245490-aa93-11e6-9402-abce09847bec" } }, { $group: { _id: "$BatchId", max: { $max: "$ModifiedDate" } } }, { $sort: { max: -1 } }, { $skip: 0 }, { $limit: 20 }, { $project: { _id: 0, BatchId: "$_id" } } ] } keyUpdates:0 writeConflicts:0 numYields:2284 reslen:1238 locks:{ Global: { acquireCount: { r: 4574 } }, Database: { acquireCount: { r: 2287 } }, Collection: { acquireCount: { r: 2287 } } } protocol:op_query "}}
13:53:41.619 [[main]>worker3] DEBUG logstash.pipeline - filter received {"event"=>{"severity"=>"I", "duration"=>"887", "path"=>"C:\Data\log\filter-mongologs-2017-03-18\mongodb3.log", "component"=>"COMMAND", "@timestamp"=>2017-03-31T09:17:34.583Z, "host"=>"Admin-PC", "context"=>"conn276785", "@version"=>"1", "content"=>"command hgthanka.Reco
The pipeline produces this continuous, repeated output, and the elasticsearch input keeps failing with the error above and restarting, so the input effectively stops.
How can I resolve this? I am using ELK version 5.3.0.
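For reference, my pipeline configuration looks roughly like the sketch below. It is reconstructed from the plugin dump in the error above; the filter block is omitted, and the fields list in the csv output is only a placeholder, since it does not appear in the log output.

input {
  elasticsearch {
    hosts   => ["localhost:9200"]
    index   => "mongo3"
    query   => '{
      "query": {
        "bool": {
          "must": {
            "match": { "content": "recognition" }
          }
        }
      }
    }'
    size    => 1000
    scroll  => "1m"
    docinfo => false
  }
}

output {
  csv {
    # placeholder field list; the real list is not shown in the log output
    fields => ["@timestamp", "content"]
    path   => "D:/elasticsearch/excel/mongo3.csv"
  }
}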