How to create multiple indexes from Filebeat log input in Logstash config

How do I configure multiple log files with different names in one logstash.conf instance?
I am using the configuration below. With a single file it works fine, but multiple files with different index names don't work.
FileBeat.yaml:

filebeat.inputs:

filebeat.prospectors:

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - c:\logfile\*
    #- c:\programdata\elasticsearch\logs\*

LogStash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => ["message", "(?(([0-9]+)-)+ ([0-9]+:)+.*)|%{WORD:LOGLEVEL}|%{WORD:LOGSOURCE}|%{GREEDYDATA:LOGMESSAGE}"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "testlog-%{+YYYY.MM.dd}"
  }
}

Can you please fix the formatting of the configuration files?

filebeat.yml:

filebeat.inputs:

filebeat.prospectors:

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - c:\logfile\*
    #- c:\programdata\elasticsearch\logs\*


  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  fields:
   level: debug
   review: 1
   forum: true
   type: "logs1"

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  multiline.negate: true
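Since the goal is three files with three indexes, one way to tell them apart on the Filebeat side is to define a separate input per file, each tagging its events with its own `fields.type` value. This is an untested sketch; the paths and type values below are placeholders, not your real file names:

```yaml
filebeat.inputs:

# One input per log file; the custom "type" field identifies the source.
# Paths and the logs1/logs2/logs3 values are placeholders - adjust to your files.
- type: log
  enabled: true
  paths:
    - c:\logfile\app1.log
  fields:
    type: "logs1"

- type: log
  enabled: true
  paths:
    - c:\logfile\app2.log
  fields:
    type: "logs2"

- type: log
  enabled: true
  paths:
    - c:\logfile\app3.log
  fields:
    type: "logs3"
```

Note that custom fields land under `fields` in the event (i.e. `[fields][type]` in Logstash) unless you also set `fields_under_root: true`.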

LogStash.conf:

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => ["message", "(?(([0-9]+)-)+ ([0-9]+:)+.*)|%{WORD:LOGLEVEL}|%{WORD:LOGSOURCE}|%{GREEDYDATA:LOGMESSAGE}"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "teslog-%log-%{+YYYY.MM.dd}"
  }
}

Hi Michal_Pristas,

Please find my formatted filebeat and logstash files.

I have three log files with different names, and I want to create an index for each of these three files based on its name or type.

Maybe you can try playing with metadata and specifying the index like
index => "teslog-%{[@metadata][source]}-%{+YYYY.MM.dd}"
or
index => "teslog-%{[@metadata][_source]}-%{+YYYY.MM.dd}"

I haven't tried this, so please let me know if it works.

Why are we using 'source' here?
I tried this earlier with type and it didn't work at that time.

I am new to ELK. Can you please try it at your end?

Hi Michal_Pristas,

I tried this, but it's not working.
Here is the index name created in Elasticsearch:

teslog-%{[@metadata][source]}-2019.04.04
or
teslog-%{[@metadata][_source]}-2019.04.04
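When the literal `%{[@metadata][source]}` shows up in the index name like this, it means Logstash could not resolve that field on the event, so the sprintf reference was left as-is. An untested alternative is to reference the custom field set in filebeat.yml (this assumes each Filebeat input sets `fields.type` to logs1/logs2/logs3, and that `fields_under_root` is not enabled):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    # [fields][type] comes from the "fields" section of each input in
    # filebeat.yml, so each file's events go to their own daily index.
    index => "teslog-%{[fields][type]}-%{+YYYY.MM.dd}"
  }
}
```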

Hi Team,

Can anyone help me with how to configure multiple indexes from Filebeat input?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.