Query works directly in Elasticsearch but not in the Logstash elasticsearch input?

This query runs perfectly when sent directly to Elasticsearch, but it doesn't work in Logstash.
Could someone explain to me why?
Thanks

My query:

GET /my.logs/_search?filter_path=hits.total,hits.hits._source&size=9999
{
  "query": {
    "constant_score": {
      "filter": {
        "term": {
          "runID": "50d580be-9c1d-43b9-acf9-362eb793f690"
        }
      }
    }
  },
  "sort": { "timestamp": { "order": "asc" } }
}
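
For reference, the same request can also be reproduced outside Kibana with curl (a sketch assuming Elasticsearch is reachable on localhost:9200, as in the config below):

curl -s -XGET 'http://localhost:9200/my.logs/_search?filter_path=hits.total,hits.hits._source&size=9999' \
  -H 'Content-Type: application/json' -d '
{
  "query": {
    "constant_score": {
      "filter": {
        "term": { "runID": "50d580be-9c1d-43b9-acf9-362eb793f690" }
      }
    }
  },
  "sort": { "timestamp": { "order": "asc" } }
}'

This returns the expected hits sorted by timestamp, so the request body itself seems fine.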

I'm trying to put this query in a Logstash config file:

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "my.logs"
    query => '{
      "constant_score" : {
        "filter" : {
          "term" : {
            "runID" : "50d580be-9c1d-43b9-acf9-362eb793f690"
          }
        }
      }
    },
    "sort":
      { "timestamp": { "order": "asc" }}
    '
  }
}
filter {
}
output {
  file {
    path => "d:\Data\metric.txt"
  }
}

I received this error:

[2017-11-20T21:20:14,091][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"my.logs", docinfo=>true, query=>"{\n "constant_score" : {\n "filter" : {\n "term" : {\n "runID" : "50d580be-9c1d-43b9-acf9-362eb793f690"\n }\n }\n }\n },\n "sort": \n { "timestamp": { "order": "asc" }}\n", id=>"27da4c0a8f6642637de8dc9964654acf2551c883e23a2592131fbfd1302891fa", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_ea726047-64c9-462e-aada-d5ac1e826dd2", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
Error: [400] {"error":{"root_cause":[{"type":"parsing_exception","reason":"Unknown key for a START_OBJECT in [constant_score].","line":2,"col":28}],"type":"parsing_exception","reason":"Unknown key for a START_OBJECT in [constant_score].","line":2,"col":28},"status":400}
Exception: Elasticsearch::Transport::Transport::Errors::BadRequest
Stack: D:/Elastic/logstash-

Does nobody have an idea? Thanks in advance.

I have the exact same error. From the error alone I am unable to determine which pipeline is causing it, or exactly why the query is being rejected.

06/12/2017 00:03:39 Error: [400] {"error":{"root_cause":[{"type":"parsing_exception","reason":"Unknown key for a START_OBJECT in [bool].","line":3,"col":13}],"type":"parsing_exception","reason":"Unknown key for a START_OBJECT in [bool].","line":3,"col":13},"status":400}

The ES logs are flooded with this error, and it must have a big impact on my report. Ideally the error would spit out the text that failed to parse, so I could reproduce it and confirm what the problem is. I don't know how to do that, though, so I'll follow this post for the moment until someone answers.


OK, after enabling LOG_LEVEL=debug, the previous error mentions a specific pipeline of mine:

[2017-12-08T23:37:00,092][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Pipeline_id:main
  Plugin: <LogStash::Inputs::Elasticsearch hosts=>["elasticsearch:9200"], index=>"myindex", query=>"\n\t  {\n\t\t  \"bool\": {\n\t\t\t\"must\": [\n\t\t\t  { \"match\": { \"message\": \"query\":  \"WebHookHelper deleting AppID instanceId\", \"type\": \"phrase\" }}\n\t\t\t]\n\t\t  }\n\n\t  }", id=>"53eb1887ad75b4c5e6d069205c78d27900d377eb0b52438be93fb7a3297bdefa", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_1511329a-1c81-4e9d-b3f1-ebe1bdef607f", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
  Error: [400] {"error":{"root_cause":[{"type":"parsing_exception","reason":"Unknown key for a START_OBJECT in [bool].","line":3,"col":13}],"type":"parsing_exception","reason":"Unknown key for a START_OBJECT in [bool].","line":3,"col":13},"status":400}
  Exception: Elasticsearch::Transport::Transport::Errors::BadRequest
  Stack: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/base.rb:202:in `__raise_transport_error'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/base.rb:319:in `perform_request'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/client.rb:131:in `perform_request'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/elasticsearch-api-5.0.4/lib/elasticsearch/api/actions/search.rb:183:in `search'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-elasticsearch-4.1.0/lib/logstash/inputs/elasticsearch.rb:152:in `run'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:574:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:567:in `block in start_input'

Is there a misconfiguration for LogStash::Codecs::JSON?

Here is the query I am making (inside the pipeline):

input {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "myindex"
    query => '
      {
        "bool": {
          "must": [
            { "match": { "message": "query": "WebHookHelper deleting AppID instanceId", "type": "phrase" }}
          ]
        }
      }'
  }
}

output {
  csv {
    # These are the fields to output in CSV format.
    # Each field needs to be one of the fields shown in the output when you run your
    # Elasticsearch query.
    fields => ["timestamp", "message"]
    # This is where we store the output. We can use several files to store the output
    # by using a timestamp to determine the filename.
    path => "/tmp/output/deletedApps-%{+YYYY-MM-dd}.csv"
    csv_options => {"col_sep" => "\t" "row_sep" => "\n"}
  }
}

I eventually worked it out. The problem is that the elasticsearch input requires the

{ "query" ...

wrapper to be put inside the query field, i.e.:

input {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "my_index"
    query => '{ "query": { "bool": { "must": [ { "match_phrase": { "message": { "query": "WebHookHelper deleting AppID instanceId" } }} ] } }, "sort": ["@timestamp"] }'
  }
}

This is why it was complaining: notice that "query" now appears again inside the query field.
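
For reference, applying the same fix to the first config in this thread would look something like this (an untested sketch that simply reuses the hosts, index, runID, and sort from the original post):

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "my.logs"
    query => '{
      "query": {
        "constant_score": {
          "filter": {
            "term": { "runID": "50d580be-9c1d-43b9-acf9-362eb793f690" }
          }
        }
      },
      "sort": [ { "timestamp": { "order": "asc" } } ]
    }'
  }
}

The whole request body, including the sort clause, goes inside the single quoted string passed to query, with "query" as one of its top-level keys.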
