Filebeat different logfiles to different backends

Hello,

I am trying to use Filebeat to send different log types, but they need to end up in different indexes. My idea was to create different rules in Logstash by listening on different ports and have Filebeat send every log type to a different port, but I am not sure if that is possible.

Any idea?

Regards,

Do you do any processing in Logstash? If not, you can configure the index based on the event when pushing directly to Elasticsearch.

Using the fields setting in the prospectors, you can add custom fields (e.g. fields.service: service1) and define the index name based on the custom field.
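For example, a minimal sketch of that Logstash-free setup (the path, hosts, and the service1 name are placeholders; this assumes a Filebeat 5.x/6.x Elasticsearch output, which supports format strings in the index option):

filebeat.prospectors:
- paths: ["/var/log/service1/*.log"]   # hypothetical path
  fields_under_root: true              # put "service" at the top level of the event
  fields:
    service: service1
output.elasticsearch:
  hosts: ["localhost:9200"]
  # index name built from the custom field plus the date
  index: "%{[service]}-%{+yyyy.MM.dd}"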

Yes, I am using Logstash, and the application needs to have its own indices while the other logs can be put together. That is why I was thinking of sending different indices to different Logstash backends. If this is not possible via the regular Filebeat config, I might end up cloning the Filebeat service and running a second Filebeat with a different configuration.

You can use fields or tags in filebeat prospectors:

filebeat.prospectors:
# prospector for the application logs -> service "a"
- ...
  fields_under_root: true
  fields:
    service: a
  tags: [...]
# prospector for the remaining logs -> service "b"
- ...
  fields_under_root: true
  fields:
    service: b
  tags: [...]

Using these settings in Filebeat, you can access the service field in Logstash like any other event field via [service]. Using tags, you can filter in Logstash with if "mytag" in [tags] ...

For example, you can construct the index name from service in the Logstash Elasticsearch output like: index => "%{[service]}-%{+yyyy.MM.dd}"
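Putting it together, a minimal Logstash pipeline sketch (the port, hosts, "mytag" and the added tag are placeholders, not part of the original thread):

input {
  beats {
    port => 5044
  }
}

filter {
  # example of filtering on a tag set in Filebeat
  if "mytag" in [tags] {
    mutate { add_tag => ["routed"] }   # hypothetical extra processing
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # index name built from the service field added in Filebeat
    index => "%{[service]}-%{+yyyy.MM.dd}"
  }
}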
