Multiple indices from one log file

Dear Community!

I have a question about Filebeat configuration. Is there any way to create two separate indices from the same log file? My goal is to have two indices in Kibana built from one log file: one index for searching the full logs (for debugging), and one for reporting tasks (for that index I would use processors to dissect the events and remove the unnecessary fields).

Something like this:

filebeat.inputs:
  # First input: full events, kept as-is for debugging
  - type: log
    enabled: true
    fields:
      log_type: log-1
    paths:
      - path/to/logfile.log
  # Second input: same file again, meant for the stripped-down reporting index
  - type: log
    enabled: true
    fields:
      log_type: log-2
    paths:
      - path/to/logfile.log

I know that the "paths" of two log inputs can't point to the same file (as seen above), which is why I am interested in what an alternative solution could be.

I am using Filebeat 7.12 and Kibana 7.6.2.

Hi @kkovacs

Logstash may give you more flexibility here: you can create one or multiple pipelines, manipulate the log the way you want, and still keep the log in its raw form.

Multiple Pipelines | Logstash Reference [8.4] | Elastic
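
For example, here is a rough sketch of the forked-path pattern with pipeline-to-pipeline communication; the port, hosts, pipeline ids, index names, and removed fields below are only placeholders, not taken from your setup:

# pipelines.yml: one intake pipeline fans each event out to two downstream pipelines
- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    output { pipeline { send_to => ["raw-logs", "reporting"] } }

- pipeline.id: raw-logs
  config.string: |
    # Full events, kept as-is for debugging
    input { pipeline { address => "raw-logs" } }
    output { elasticsearch { hosts => ["localhost:9200"] index => "raw-%{+yyyy.MM.dd}" } }

- pipeline.id: reporting
  config.string: |
    # Stripped-down copy for reporting
    input { pipeline { address => "reporting" } }
    filter { mutate { remove_field => ["some_unneeded_field"] } }
    output { elasticsearch { hosts => ["localhost:9200"] index => "report-%{+yyyy.MM.dd}" } }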

best regards

What is your output?

If your output is Elasticsearch, you may be able to use the indices configuration of the Elasticsearch output.

The documentation has a couple of examples that may fit your use case.
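
For illustration, a minimal sketch of what such an indices block could look like; the hosts, index names, and conditions here are placeholders, not taken from your setup:

output.elasticsearch:
  hosts: ["localhost:9200"]
  indices:
    # Route events tagged log-1 to the debug index
    - index: "debug-%{+yyyy.MM.dd}"
      when.equals:
        fields.log_type: "log-1"
    # Route events tagged log-2 to the reporting index
    - index: "report-%{+yyyy.MM.dd}"
      when.equals:
        fields.log_type: "log-2"

Note that indices routes each event to a single index based on the first matching condition; it does not duplicate events.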

If your output is Logstash, then you should do this routing in Logstash instead, which is much easier.

First of all, thank you for the answers.

I am not using Logstash right now, and I am trying to solve the problem without it. My output is Elasticsearch, so I am going to check the linked documentation for a possible solution.

This is my output right now:

output:
  elasticsearch:
    hosts: [...ip address...]
    index: "%{[fields.log_type]}-%{+yyyy.MM.dd}"

Thanks again.
