I am trying to install modules such as ArcSight and Netflow alongside my currently running services (Filebeat, Packetbeat, Suricata, Wazuh), but when I install the ArcSight or Netflow module, I am not able to insert the indexes, dashboards, etc. into Elasticsearch and Kibana, and I get the error below.
Any help or guidance will be much appreciated.
ELK Stack version is 6.6.2.
[2019-04-02T20:28:17,170][ERROR][logstash.modules.kibanaclient] Error when executing Kibana client request {:error=>#<Manticore::UnknownException: Unrecognized SSL message, plaintext connection?>}
[2019-04-02T20:28:20,300][ERROR][logstash.modules.kibanaclient] Error when executing Kibana client request {:error=>#<Manticore::UnknownException: Unrecognized SSL message, plaintext connection?>}
[2019-04-02T20:28:20,731][ERROR][logstash.config.sourceloader] Could not fetch all the sources {:exception=>LogStash::ConfigLoadingError, :message=>"Failed to import module configurations to Elasticsearch and/or Kibana. Module: arcsight has Elasticsearch hosts: [\"localhost:9200\"] and Kibana hosts: [\"localhost:5601\"]", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config/modules_common.rb:108:in `block in pipeline_configs'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/config/modules_common.rb:54:in `pipeline_configs'", "/usr/share/logstash/logstash-core/lib/logstash/config/source/modules.rb:14:in `pipeline_configs'", "/usr/share/logstash/logstash-core/lib/logstash/config/source_loader.rb:61:in `block in fetch'", "org/jruby/RubyArray.java:2481:in `collect'", "/usr/share/logstash/logstash-core/lib/logstash/config/source_loader.rb:60:in `fetch'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:150:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:101:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2019-04-02T20:28:20,753][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>RuntimeError, :message=>"Could not fetch the configuration, message: Failed to import module configurations to Elasticsearch and/or Kibana. Module: arcsight has Elasticsearch hosts: [\"localhost:9200\"] and Kibana hosts: [\"localhost:5601\"]", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/agent.rb:157:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:101:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
The first two errors above are SSL-related; the third is the failure to import the module configuration.
I am not using SSL, so I will add parameters to disable it. I want to run Netflow, ArcSight, and the other Beats in a single Logstash instance. Please help by suggesting a way forward and a sample Logstash configuration file.
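For disabling SSL on the module's Elasticsearch and Kibana clients, a minimal sketch of the logstash.yml module settings might look like the following. The `var.*.ssl.enabled` variable names are my recollection of the 6.x module settings, so please verify them against the docs for your exact version:

```yaml
# logstash.yml - sketch, assuming the 6.x module variable names
modules:
  - name: arcsight
    var.elasticsearch.hosts: "localhost:9200"
    var.elasticsearch.ssl.enabled: false
    var.kibana.host: "localhost:5601"
    var.kibana.ssl.enabled: false
```

The "Unrecognized SSL message, plaintext connection?" error usually means one side is speaking TLS while the other is plain HTTP, which is consistent with SSL being enabled on the module client but not on Kibana.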
I will be adding more logic to this section as well, since I am doing some consolidation right now.
The output of my route5044 pipeline then handles the routing to the correct downstream pipeline.
output {
  if "postgres" in [fields] {
    pipeline {
      send_to => "postgres_log"
    }
  }
  else if "meraki" in [fields] {
    pipeline {
      send_to => "meraki_syslog"
    }
  }
  else if "forum" in [fields] {
    pipeline {
      send_to => "forum_help"
    }
  }
  else {
    pipeline {
      send_to => "catch_all"
    }
  }
}
Then finally the input for my forum_help looks like this:
input {
  pipeline {
    address => "forum_help"
  }
}
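For these pipeline addresses to resolve, each pipeline also has to be declared in pipelines.yml. A sketch of how that could look (the config paths here are hypothetical, not my actual layout):

```yaml
# pipelines.yml - hypothetical ids and paths matching the addresses above
- pipeline.id: route5044
  path.config: "/etc/logstash/conf.d/route5044.conf"
- pipeline.id: postgres_log
  path.config: "/etc/logstash/conf.d/postgres_log.conf"
- pipeline.id: meraki_syslog
  path.config: "/etc/logstash/conf.d/meraki_syslog.conf"
- pipeline.id: forum_help
  path.config: "/etc/logstash/conf.d/forum_help.conf"
- pipeline.id: catch_all
  path.config: "/etc/logstash/conf.d/catch_all.conf"
```

Each `pipeline.id` must match the `send_to` / `address` values used in the configs, otherwise events have nowhere to go.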
Quick run-down:
- On the Beats side, I create a virtual pipeline address by adding a field.
- The route pipeline accepts the input.
- The route pipeline's output checks what the virtual pipeline address is and routes the event to the correct pipeline.
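For the first step, the field can be set on the Beat itself. A sketch for Filebeat (the key `forum` is just an example chosen to match the routing conditional above):

```yaml
# filebeat.yml - sketch: add a key under [fields] for the route pipeline to check
fields:
  forum: true
output.logstash:
  hosts: ["localhost:5044"]
```

Any Beat that supports a `fields` section can be tagged the same way, so they can all share the single port-5044 routing pipeline.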
I'm afraid that I do not have a setup that I can test Netflow or Arcsight with.
The only thing you would have to do is assign a field to your Netflow or ArcSight events, and it should work.
If you can't add the field through those services, you can move them to a different port instead. For example, I pull syslog data from Meraki, but Meraki doesn't let me add custom fields, so I moved it to its own port. Here is my config for that:
This is my logstash-proxy config, but you should be able to change it to add it to your regular pipeline.
I have all of my Meraki stuff connect to port 5045, and then on the input I add the [fields][meraki] which then my regular pipeline logic takes over.
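A minimal sketch of what such a port-5045 listener could look like (the input plugin choice and the mutate filter are my assumptions, not the exact logstash-proxy config):

```conf
# sketch: separate listener that tags Meraki syslog on its own port
input {
  udp {
    port => 5045
    type => "syslog"
  }
}
filter {
  mutate {
    # mimic the Beats-style field so the routing conditional matches
    add_field => { "[fields][meraki]" => "true" }
  }
}
```

From there the output can hand the tagged events to the regular routing logic, for example with a pipeline output as in the route5044 config above.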
So moving any service where you can't add a field to its own port lets you uniquely identify those logs and add your own fields.