Using Filebeat with multiple indices and Logstash

Hi everyone,

I've got a Logstash instance with a Beats input on 5044. I'm now trying to send Filebeat data to that instance (which is working fine with a single index).

I'd like to have multiple indices depending on the module/import type used.

As an example let's say that I'm using the IIS and RabbitMQ modules.

In order not to put too much into a single index, I'd like to use a pattern like:

- module: rabbitmq
  log:
    enabled: true

    var.paths: ["/.../log/rabbit@*.log*"]
    input:
      ignore_older: 24h
      clean_removed: true
    fields:
      module_type: "rabbitmq"

The idea was to use the variable module_type in the index field.
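Roughly something like this, just to illustrate what I'm after (I'm not sure about the exact field reference syntax):

  index: 'filebeat-%{[fields.module_type]}-%{[agent.version]}'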

I was trying to use the Logstash output:

output.logstash:
  # Boolean flag to enable or disable the output module.
  enabled: true

  # The Logstash hosts
  hosts: ["mynode.com:5044"]

  # Optional index name. The default index name is set to filebeat
  # in all lowercase.
  index: 'filebeat-test2-%{[agent.version]}'

The example above uses another variable, agent.version, which is taken from the examples for the Elasticsearch output and should resolve fine even if my own field has not been set up correctly.

When sending the files through Logstash, only the index "filebeat-test2-%{[agent.version]}-2019.12.11" gets created.

It seems as if the Logstash output does not resolve the variables.

How can I achieve this without changing my Logstash config or running two instances of Filebeat?

Regards,

J

Hi!

You are right, you cannot set the index in the Logstash output the way you can in the Elasticsearch output. Have a look at how the Logstash output handles index names and how this can be combined with @metadata in the Logstash config.

The basic difference between the two outputs is that the Elasticsearch output can create indices directly, while the Logstash output has to pass the data through Logstash first.
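For reference, the usual pattern from the docs is to let Logstash build the index name from the metadata that the Beats input adds to every event, roughly like this:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

Any extra fields you set in Filebeat travel along in the event, so you can combine them with that metadata in the Logstash config.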

Thanks for the confirmation.

So what I would like to achieve is setting values on a module and using those again in Logstash.

My example Logstash config:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[module_type]}%{[@metadata][version]}"
  }
}

My module config for rabbitmq looks like this:

- module: rabbitmq
  log:
    enabled: true

    var.paths: ["/.../log/rabbit@*.log*"]
    input:
      ignore_older: 24h
      clean_removed: true
    fields:
      module_type: "rabbitmq"

I think I have not yet understood how fields are added, as the (text) output does not show those fields in the JSON at all.

From what I've researched, adding values directly to @metadata is not possible.

In the end I would like to have multiple modules running on a single Beat and to separate the data in the cluster into multiple indices. Going through Logstash instead of connecting directly to the cluster is a must, as is not having a separate configuration for every module in Logstash.
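If the custom field arrived as expected, I would simply reference it in the index name, something like (untested):

  index => "%{[@metadata][beat]}-%{[fields][module_type]}-%{[@metadata][version]}"

But since the field does not show up in the event at all, that obviously cannot resolve either.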

How can I achieve this?

Thanks
J

Hi!

Could you try event-dependent configuration? I think this is what you are looking for.
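Something along these lines (an untested sketch, assuming your custom field arrives as [fields][module_type] once it is applied; you may need to move the fields option under the fileset's input section so Filebeat actually picks it up):

output {
  if [fields][module_type] == "rabbitmq" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "%{[@metadata][beat]}-rabbitmq-%{[@metadata][version]}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "%{[@metadata][beat]}-other-%{[@metadata][version]}"
    }
  }
}

If you do not need different handling per module, a single elasticsearch output with the field reference %{[fields][module_type]} in the index name, as you sketched above, avoids having one block per module in Logstash.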

Regards.
