Recently I added a second input to Filebeat. Originally I was only using syslog, and now I am also using netflow as an input.
At first I had a single index called filebeat-*, but with the second input I changed the indices: for my syslog input I created the index rpz-*, and my netflow index is called netflow-*. Everything is working fine; my Logstash pipeline is separating the data and sending it correctly to Kibana. But when I try to use the netflow dashboard, Kibana still looks for the filebeat-* index. I already reloaded the dashboards with `filebeat setup --dashboards`, but it is not working, as you can see in the screenshot.
Hmm, I'm not sure how to adjust this through the Filebeat configuration, but I'm sure there is a way (you could check in the Beats discuss forums).
As for changing the index patterns in Kibana, the easiest way is probably to export all of the saved objects you want associated with the netflow-* index pattern, open the export in a text editor, find all references to the filebeat-* index pattern ID, and change them to the ID of your netflow-* index pattern. Then delete the saved objects in Kibana and re-import the edited export file.
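The find-and-replace step above can also be scripted. A Kibana saved-objects export is an .ndjson file (one JSON object per line) where each object lists its index-pattern `references`, so a small sketch like this rewrites them. The two pattern IDs here are placeholders, not real values — look up the actual IDs in Kibana under Stack Management > Saved Objects (the index pattern's ID appears in its URL):

```python
import json

# Placeholder IDs -- replace with the real index pattern IDs from Kibana.
OLD_ID = "filebeat-pattern-id"   # ID of the filebeat-* index pattern
NEW_ID = "netflow-pattern-id"    # ID of the netflow-* index pattern

def rewrite_references(src_path, dst_path):
    """Point every index-pattern reference in a Kibana saved-objects
    export (.ndjson, one JSON object per line) at the new pattern ID."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            if not line.strip():
                continue  # skip blank lines (exports may end with one)
            obj = json.loads(line)
            for ref in obj.get("references", []):
                if ref.get("type") == "index-pattern" and ref.get("id") == OLD_ID:
                    ref["id"] = NEW_ID
            dst.write(json.dumps(obj) + "\n")
```

After running it on your export (e.g. `rewrite_references("export.ndjson", "fixed.ndjson")`), delete the old saved objects in Kibana and import the fixed file through Stack Management > Saved Objects > Import.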