Create Multiple Indexes from filebeat

Hi, I am new to the ELK stack. I am trying to set up log shipping for our applications. In total we have 29 applications (Debian packages) running in AWS, in both Auto Scaling groups and Spot Fleets. My idea is to use one Filebeat configuration on all 29 applications, but segregate by application log location and create separate indexes. Here is my Filebeat setup:

- type: log
  # Change to true to enable this input configuration.
  enabled: true
  paths:
    - /var/log/rahul/*/*/*.log
  fields:
    log_type: qa

- type: log
  paths:
    - /var/log/rahul/application-1/*/*.log
  fields:
    log_type: application-1

And here is the Logstash pipeline:

input {
  beats {
    port => 5044
    ssl  => false
  }
}
filter {}

output {
  if [log_type] == "application-1" {
    elasticsearch {
      hosts    => ["0.0.0.0:9200"]
      user     => "un"
      password => "pwd"
      index    => "application-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
  }
  if [log_type] == "qa" {
    elasticsearch {
      hosts    => ["0.0.0.0:9200"]
      user     => "un"
      password => "pwd"
      index    => "qa-%{+YYYY.MM.dd}"
      document_type => "qa_logs"
    }
    stdout { codec => rubydebug }
  }
}

With this setup, I don't see any indexes being created in Kibana.

Your conditional is wrong: those fields are not added to the root of the document, but nested under the top-level field named fields.

You should refer to your field as [fields][log_type].
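For example, the first output conditional from the config above would become (a sketch, keeping the same hosts, credentials, and index name):

```
output {
  if [fields][log_type] == "application-1" {
    elasticsearch {
      hosts    => ["0.0.0.0:9200"]
      user     => "un"
      password => "pwd"
      index    => "application-%{+YYYY.MM.dd}"
    }
  }
}
```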

If you want them added at the root of your document instead, you need to change your Filebeat config; check this part of the documentation.
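Concretely, that is the `fields_under_root` option on the Filebeat input (a sketch based on the input shown above):

```yaml
- type: log
  enabled: true
  paths:
    - /var/log/rahul/application-1/*/*.log
  fields:
    log_type: application-1
  # Place log_type at the root of the event instead of under [fields]
  fields_under_root: true
```

With that set, the original `if [log_type] == "..."` conditionals in Logstash would match without any change.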


Thank you.

Changing the condition to [fields][log_type] works; the indexes are now created as expected.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.