Conditional output from Filebeat documents via Logstash to Elasticsearch

I hope it's not a repeat question.

I'm trying to ingest multiple files from multiple directories via Filebeat --> Logstash.

In my Filebeat input config (filebeat.config.inputs: statements) I'm adding a log_type field for each directory.

myapp1.yml

    - paths:
        - C:/myapp1/*.log
      fields:
        log_type: myapp1

myapp2.yml

    - paths:
        - C:/myapp2/*.log
      fields:
        log_type: myapp2
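As an aside (not part of the original question): if you would rather have log_type at the top level of the event instead of nested under fields, Filebeat supports the fields_under_root option. A sketch, reusing the myapp1 input above:

```yaml
- paths:
    - C:/myapp1/*.log
  fields:
    log_type: myapp1
  # With fields_under_root, log_type lands at the event root,
  # so the Logstash condition would be [log_type] instead of [fields][log_type].
  fields_under_root: true
```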

In the Logstash filter:

    if [fields][log_type] == "myapp1" {
      grok {
        # my grok pattern for myapp1
      }
    }

    if [fields][log_type] == "myapp2" {
      grok {
        # my grok pattern for myapp2
      }
    }

My Logstash filter is working; I see the output in the right format because I have set stdout { codec => rubydebug }.

In the output I tried this:

    output {
      if "_grokparsefailure" not in [tags] {

        if [file][path] =~ "myapp1" {
          elasticsearch {
            hosts => [ "http://localhost:9200" ]
            index => "myapp1-%{+YYYY.MM.dd}"
          }
        }

        if [file][path] =~ "myapp2" {
          elasticsearch {
            hosts => [ "http://localhost:9200" ]
            index => "myapp2-%{+YYYY.MM.dd}"
          }
        }
      }
    }

My output is not working. Is if [file][path] =~ "myapp2" right? I have also tried if "myapp1" in [file][path] and I still don't see my indexes.

Any clue?

I think it should be: if [file][path] =~ /myapp2/ {

I was able to fix it.

The field is [log][file][path], not [file][path].
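For anyone landing here later, the corrected output section would look roughly like this (a sketch based on the config above; recent Filebeat versions ship the source file path as log.file.path, hence the [log][file][path] reference):

```conf
output {
  if "_grokparsefailure" not in [tags] {

    if [log][file][path] =~ "myapp1" {
      elasticsearch {
        hosts => [ "http://localhost:9200" ]
        index => "myapp1-%{+YYYY.MM.dd}"
      }
    }

    if [log][file][path] =~ "myapp2" {
      elasticsearch {
        hosts => [ "http://localhost:9200" ]
        index => "myapp2-%{+YYYY.MM.dd}"
      }
    }
  }
}
```

Since each input already sets fields.log_type, an alternative is to skip the path matching entirely and build the index name from that field via Logstash's sprintf syntax: index => "%{[fields][log_type]}-%{+YYYY.MM.dd}". That collapses the two conditionals into a single elasticsearch block.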

Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.