Hello:
I get the following error message when starting Logstash. It isn't communicating with the Elasticsearch servers because it fails to create the main pipeline, and I'm not sure what I set up wrong. My setup is as follows:
Server A: running Elasticsearch and Kibana
Server B: running Elasticsearch
Server C: running Logstash
Server D: running Logstash
[2019-08-28T17:44:40,372][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, \", ', -, [, { at line 12, column 11 (byte 150) after output {\n elasticsearch {\n hosts => [\"http://10.3.200.45:9200\", \"http://10.3.200.46:9200\"]\n\tindex => ", :backtrace=>["D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "D:/ELK/Logstash/logstash-core/lib/logstash/java_pipeline.rb:24:in `initialize'", "D:/ELK/Logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "D:/ELK/Logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[2019-08-28T17:44:40,513][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, \", ', -, [, { at line 12, column 11 (byte 150) after output {\n elasticsearch {\n hosts => [\"http://10.3.200.45:9200\", \"http://10.3.200.46:9200\"]\n\tindex => ", :backtrace=>["D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "D:/ELK/Logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "D:/ELK/Logstash/logstash-core/lib/logstash/java_pipeline.rb:24:in `initialize'", "D:/ELK/Logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "D:/ELK/Logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[2019-08-28T17:44:40,842][INFO ][org.reflections.Reflections] Reflections took 31 ms to scan 1 urls, producing 19 keys and 39 values
[2019-08-28T17:44:40,936][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch bulk_path=>"/_monitoring/bulk?system_id=logstash&system_api_version=7&interval=1s", hosts=>[http://10.3.200.45:9200, http://10.3.200.46:9200], sniffing=>false, manage_template=>false, id=>"39d1206c511d0843ff2a5d0bb67d91624c7352b7ed2d5b904773a46163962bb4", document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_c300e6c4-0f09-4221-8f57-bfe03305e02c", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-08-28T17:44:41,014][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.3.200.45:9200/, http://10.3.200.46:9200/]}}
[2019-08-28T17:44:41,030][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.3.200.45:9200/"}
[2019-08-28T17:44:41,030][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-08-28T17:44:41,030][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-08-28T17:44:41,030][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.3.200.46:9200/"}
[2019-08-28T17:44:41,045][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://10.3.200.45:9200", "http://10.3.200.46:9200"]}
[2019-08-28T17:44:41,092][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, :thread=>"#<Thread:0x2476feef run>"}
[2019-08-28T17:44:41,124][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2019-08-28T17:44:41,437][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-08-28T17:44:42,282][INFO ][logstash.javapipeline ] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[2019-08-28T17:44:43,127][INFO ][logstash.runner ] Logstash shut down.
Below is my Logstash conf file:
input {
  beats {
    port => 5045
  }
}
output {
  elasticsearch {
    hosts => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
    index => %{[@metadata][beat]}-%{[@metadata][version]}
  }
}
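Re-reading the error, the parser seems to stop right at the `index` value (line 12, column 11, just after `index =>`), so my guess is that the sprintf string needs to be quoted. Something like this, maybe:

```
output {
  elasticsearch {
    hosts => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
    # quoted so the %{...} sprintf references parse as a string value
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
}
```

Is that all it is, or is something else wrong with my config?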
Thank you in advance.