How to create multiple indices for multiple logs from Filebeat?


(alan) #1

Hi,
Recently I set up filebeat + logstash + ES (all 6.x) to collect some logs. I want different logs to go to different indices; how can I do that?
I read the docs, which say document_type was deprecated in 6.x, so I configured fields in filebeat.yml:

filebeat:
  prospectors:
    - type: log
      paths:
        - /var/log/cron-20180408
      fields:
        logs_type: crontab
    - type: log
      paths:
        - /var/log/messages
      fields:
        logs_type: syslog

output:
  logstash:
    hosts: ["192.168.1.100:5044"]

Configuration for Logstash:

input {
  beats {
    port => "5044"
  }
}

filter {
}

output {
  if [fields][logs_type] == "syslog" {
    elasticsearch {
      hosts => "192.168.1.200:9200"
      index => "syslog"
    }
  }
  else if [fields][logs_type] == "crontab" {
    elasticsearch {
      hosts => "192.168.1.200:9200"
      index => "crontab"
    }
  }
}
But Logstash does not create separate indices for the logs.
Could someone tell me how to do it?
Thank you very much!


(Pier-Hugues Pellerin) #2

From what I see, Logstash should correctly receive the [fields][logs_type] field, so you can use string interpolation in the Elasticsearch output definition to set the index dynamically. That lets you remove all the conditionals:

output {
  elasticsearch {
    hosts => "192.168.1.200:9200"
    index => "%{[fields][logs_type]}"
  }
}
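A common variant (an assumption on my part, not something from this thread) adds a date suffix so each log type gets daily indices, which makes retention and cleanup easier:

```conf
output {
  elasticsearch {
    hosts => "192.168.1.200:9200"
    # hypothetical example: one index per log type per day,
    # e.g. "syslog-2018.04.08" or "crontab-2018.04.08"
    index => "%{[fields][logs_type]}-%{+YYYY.MM.dd}"
  }
}
```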

Be warned: every extra index requires additional shards.
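If the shard count is a concern, one way to cap it (a sketch, assuming Elasticsearch 6.x; the template name beats-logs is made up for this example) is an index template that gives these small indices a single primary shard instead of the default five:

```json
PUT _template/beats-logs
{
  "index_patterns": ["syslog*", "crontab*"],
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 1
  }
}
```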


(alan) #3

Thank you very much!


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.