Hey,
I am currently trying to run an aggregation query and write the result back to an index, so I thought it would be a good idea to use the Elasticsearch input plugin to execute the query.
My configuration:
config.d/test.conf:
input {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "test"
    ssl => true
    ca_file => '/elk/logstash-6.1.2/config/ca.crt'
    query => '{ "query: { "match": { "serviceCode": "Test" } } }'
    user => "elastic"
    password => "xxxxx"
    #schedule => "* * * * *"
  }
}

output {
  stdout {
    codec => "rubydebug"
  }
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "test-%{+YYYY-MM-dd}"
    ssl => true
    cacert => '/elk/logstash-6.1.2/config/ca.crt'
    user => "elastic"
    password => "xxxxx"
  }
}
The problem:
If I execute the query via curl on the command line:

curl -u elastic:xxx --cacert '/elk/logstash-6.1.2/config/ca.crt' -XPOST https://localhost:9200/test/_search -H 'Content-Type: application/json' -d '{ "query": { "match": { "serviceCode": "Test" } } }'

everything works and I get a result.
But if I do it with the Logstash Elasticsearch input plugin, I get the following error:
[2018-03-13T10:05:31,454][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Elasticsearch hosts=>["https://localhost:9200"], ssl=>true, ca_file=>"/elk/logstash-6.1.2/config/ca.crt", user=>"elastic", password=>, id=>"8e23d1eb74179097bd3586d920769e0ae59d16adcf229057e3a13cd26e6c9b89", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_d7659cf1-e712-4ffa-b361-b714c70a4556", enable_metric=>true, charset=>"UTF-8">, index=>"test", query=>"{ "query":{ "match": { "serviceCode":\ "Test" } }", size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"]>
Error: 400 "Bad Request"
Exception: Faraday::ConnectionFailed
Stack: /elk/logstash-6.1.2/vendor/jruby/lib/ruby/stdlib/net/http/response.rb:120:in `error!'
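One sanity check that might help narrow this down: does the query string from the input block parse as JSON on its own? The plugin sends it verbatim to Elasticsearch, so a malformed body alone could explain a 400. A minimal sketch using Python's standard json module, with the query string copied from the config above:

```python
import json

# Query string copied verbatim from the Logstash input configuration above.
query = '{ "query: { "match": { "serviceCode": "Test" } } }'

try:
    json.loads(query)
    print("query parses as valid JSON")
except json.JSONDecodeError as exc:
    print(f"query is not valid JSON: {exc}")
```

If the parser rejects the string, Elasticsearch would answer the search request with 400 Bad Request regardless of the SSL and authentication settings, which would explain why the curl command (with its differently quoted body) succeeds while the plugin fails.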