Separate indexes for grok filters

I am wondering how to create separate indexes for different logs fetched into Logstash (and later passed on to Elasticsearch), so that they can be distinguished in Kibana.

In my case, I have a few client servers (each with Filebeat installed) and a centralized log server (ELK). Each client server produces different kinds of logs, so events from one Filebeat should be matched by one grok filter and events from another Filebeat by a different grok filter.

I am using ELK 7.6, and I heard that document_type was deprecated. How can I achieve this?

Hi there,

Add a tag in your Filebeat configuration, then in your Logstash pipeline separate the actions to take with an if...else statement on the tags field (or whatever other field you put your tags into).
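For example, a per-input `tags` setting in filebeat.yml could look like this (a sketch; the tag name and log path are placeholders):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log   # placeholder path
    tags: ["app-logs"]       # placeholder tag, checked later in Logstash
```

In the Logstash pipeline you can then branch on it with `if "app-logs" in [tags] { ... }`.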

Hi Fabio,
What tag do we have to add in the filebeat.yml file? I have added `if [type] == "xxx"` in the Logstash filter section, but how will Filebeat direct its events to that particular tag (xxx) in Logstash?

Will it work if in your filebeat.yml you put something like

```yaml
processors:
  - add_tags:
      tags: ["xxx"]
      target: "type"
```

and in your Logstash conf file a check like `if "xxx" in [type]`?
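To spell that out, the Logstash side could look like the sketch below (the grok pattern, hosts, and index names are assumptions for illustration):

```
filter {
  if "xxx" in [type] {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  }
}

output {
  if "xxx" in [type] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "xxx-logs-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "other-logs-%{+YYYY.MM.dd}"
    }
  }
}
```

With a conditional in the output section like this, each tagged stream ends up in its own index, which is what makes them separable in Kibana.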

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.