Logstash Netflow Module on Debian - Elasticsearch index not found: netflow-*

I set up a Debian 9 machine, installed OpenJDK 8, and then followed the Quick Start Guide to set up Elasticsearch and Kibana; finally I installed Logstash to pump some Netflow data into ELK and see what it can do.

However, the 6.6 documentation does not seem to work as-is on Debian. First, Logstash could not connect to Kibana because I had changed the bind IP in kibana.yml from localhost to the machine's public IP. After changing it to 0.0.0.0 (thanks to a Stack Overflow post), it could connect but got an SSL error, so I also had to set var.kibana.ssl.enabled to false.
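For reference, the bind-address change that fixed the Kibana connection was this line in kibana.yml (the path assumes the standard Debian package install):

# /etc/kibana/kibana.yml
server.host: "0.0.0.0"

The SSL error then went away after setting var.kibana.ssl.enabled to false, which also shows up in the module settings in logstash.yml below.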
Finally, I was able to start Logstash without any errors using this command line:

./logstash --path.settings=/etc/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055

My logstash.yml was left untouched except for the modules section:

modules:
    - name: netflow
      var.input.udp.port: 2055
      var.kibana.ssl.enabled: false
      var.elasticsearch.enabled: false

But I don't see anything in Kibana. When I go to Management -> Kibana -> Index Patterns, I can see that netflow-* was created with a bunch of fields, but there is no index on the Elasticsearch side. When I check manually, I can also see that nothing is happening:

root@flowretina:~#  curl -X GET 'http://localhost:9200/_cat/indices?v'
health status index     uuid                   pri rep docs.count docs.deleted store.size   pri.store.size
green  open   .kibana_1 q0FXqwoxQ92_aDykENE9XQ   1   0         91            0     79.6kb         79.6kb

But I can see that the server receives data on port 2055 (see the tcpdump check after the log below) and that the java process is using up a lot of CPU time. I am stuck now: I followed the documentation exactly, but it does not work. Here is the full startup log:

root@flowretina:/usr/share/logstash/bin# ./logstash --path.settings=/etc/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2019-02-21T14:42:15,985][INFO ][logstash.config.source.modules] Both command-line and logstash.yml modules configurations detected. Using command-line module configuration to override logstash.yml module configuration.
[2019-02-21T14:42:16,006][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-02-21T14:42:16,019][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.6.1"}
[2019-02-21T14:42:16,744][INFO ][logstash.config.source.modules] Both command-line and logstash.yml modules configurations detected. Using command-line module configuration to override logstash.yml module configuration.
[2019-02-21T14:42:16,857][INFO ][logstash.config.modulescommon] Setting up the netflow module
[2019-02-21T14:42:38,634][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"module-netflow", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-02-21T14:42:38,964][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-02-21T14:42:39,053][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-02-21T14:42:39,077][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-02-21T14:42:39,081][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-02-21T14:42:39,116][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-02-21T14:42:40,182][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-02-21T14:42:40,209][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-ASN.mmdb"}
[2019-02-21T14:42:40,211][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-02-21T14:42:40,212][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-ASN.mmdb"}
[2019-02-21T14:42:40,213][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-02-21T14:42:40,214][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-ASN.mmdb"}
[2019-02-21T14:42:40,215][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-02-21T14:42:40,216][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-ASN.mmdb"}
[2019-02-21T14:42:40,340][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"module-netflow", :thread=>"#<Thread:0x3a96ca41 run>"}
[2019-02-21T14:42:40,421][INFO ][logstash.inputs.udp      ] Starting UDP listener {:address=>"0.0.0.0:2055"}
[2019-02-21T14:42:40,478][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:"module-netflow"], :non_running_pipelines=>[]}
[2019-02-21T14:42:40,557][INFO ][logstash.inputs.udp      ] UDP listener started {:address=>"0.0.0.0:2055", :receive_buffer_bytes=>"212992", :queue_size=>"2000"}
[2019-02-21T14:42:40,854][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
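For reference, this is roughly how I confirmed that flow packets actually reach the machine on port 2055 (the interface name is an example; yours may differ):

tcpdump -n -i eth0 udp port 2055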

On a side note: this only works when I start Logstash from the command line. When I start it with service logstash start, it does not write any logfiles and does nothing; it seems to ignore my configuration completely.
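For anyone debugging the same thing, the service's status and log output can at least be inspected through systemd (the unit name assumes the official Debian package):

systemctl status logstash
journalctl -u logstash -n 100 --no-pager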

I am stupid. I forgot to open the port in the ufw firewall, so tcpdump showed the data being received, but it could never reach Logstash. Now it works, though I just found out that only Netflow v5 is supported. I will have to look into ElastiFlow now.
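For anyone else who lands here, the actual fix was simply allowing the NetFlow port through ufw (the port matches my exporter configuration; adjust as needed):

ufw allow 2055/udp
ufw status verbose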
