Elastic Stack shipped dashboards for Kibana

Hi community,

I am fairly new to the Elastic Stack (or ELK) and I am a bit confused about the role of Logstash in conjunction with the Beats. The Beats ship with dashboards you can import easily, but these dashboards use the different Beats indexes (metricbeat-*, packetbeat-*, etc.) by default. If you use Logstash, everything is indexed into 'logstash-*' by default, unless you configure Logstash to use different indexes.
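For reference, the index name Logstash writes to is configurable in the elasticsearch output, so the Beats-style index names can be kept even when shipping through Logstash. A minimal sketch (hosts are placeholders; the index pattern relies on the @metadata fields that Beats add to each event):

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Index per Beat (e.g. metricbeat-2017.04.01), matching the
    # default index names the shipped dashboards expect.
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
```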

You could let the Beats index their data directly into Elasticsearch, but then you cannot enrich the data if you want to... am I correct?

How would you set this all up when you want to make use of the Beats, Logstash and(!) the dashboards shipped with Beats?

In the case of Metricbeat and Packetbeat, you can send the data directly to Elasticsearch without the need to send it through Logstash, as the data is already structured. In this case, you can use the sample Kibana dashboards as they are, without needing to change the index.
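For that setup, pointing the Beat straight at Elasticsearch is just a matter of the output section in its config file. A sketch for Metricbeat (host is a placeholder):

```yaml
# metricbeat.yml (sketch)
output.elasticsearch:
  hosts: ["localhost:9200"]
# Leave output.logstash commented out so data is not routed
# through Logstash; the default metricbeat-* index is used,
# which the sample dashboards expect.
```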

In the case of Filebeat and Winlogbeat, the data is sent in a raw format, and you need Logstash or an Ingest Node pipeline in Elasticsearch to parse the logs before indexing them. This is one of the reasons we don't have a sample dashboard for Filebeat: we cannot know in advance how the data will look after parsing.
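To illustrate the Ingest Node route: you define a pipeline in Elasticsearch with a grok processor and have Filebeat send events through it. A sketch, assuming a hypothetical log format of "timestamp level message":

```json
PUT _ingest/pipeline/my-app-logs
{
  "description": "Parse a simple app log line (hypothetical format)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}"]
      }
    }
  ]
}
```

Filebeat can then reference this pipeline in its elasticsearch output, so parsing happens at index time without Logstash in the path.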

We are working on a new feature (Filebeat modules) that comes with the out-of-the-box configuration needed to read, parse, and visualize data from various log file formats. This includes Ingest Node pipelines, Elasticsearch templates, Filebeat prospector configurations, and Kibana dashboards.

Thank you for your quick reply and the bit of insight into what you are all working on!

Those modules are already present, I think, in e.g. Metricbeat, right? https://www.elastic.co/guide/en/beats/metricbeat/current/metricbeat-modules.html
So this is also coming for Filebeat, nice!

With those modules for Filebeat, what is the role of Logstash in the stack, in your opinion? Still enrichment, I guess...

Filebeat modules give you a nice getting started experience, but for advanced functionality you would typically introduce Logstash. For example, you would use Logstash when you need a persistent queue before sending data to Elasticsearch, when you want to send the data to other systems besides Elasticsearch, or when you want to enrich your data by getting more information from external sources.
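As a sketch of the enrichment and fan-out cases (field names and the Kafka topic are made up for illustration):

```conf
filter {
  # Enrich each event with location data looked up from the
  # client IP, using the bundled GeoIP database.
  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Fan the same events out to a second system.
  kafka {
    topic_id => "enriched-logs"
  }
}
```

For the persistent queue case, that is a Logstash setting rather than pipeline config: `queue.type: persisted` in logstash.yml buffers events on disk before they reach Elasticsearch.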

This topic was automatically closed after 21 days. New replies are no longer allowed.