Logstash not writing to Elasticsearch

I am using the following pipeline: Filebeat -> Logstash -> Elasticsearch.
I am receiving two types of logs, Apache and Log4j, and I am trying to route them to different indexes:

```
output {
  if [flow] == "apache" {
    elasticsearch {
      hosts => ["xx.xx.xx.xx:9200"]
      index => "apachelogs-%{+YYYY.MM.dd}"
    }
  } else if [flow] == "log4j" {
    elasticsearch {
      hosts => ["xx.xx.xx.xx:9200"]
      index => "jbosslogs-%{+YYYY.MM.dd}"
    }
  }
}
```

This does not create any indexes in Elasticsearch, yet there are no errors in the Logstash or Elasticsearch logs that I can see.
My filebeat.yml input section reads like this:

```
- type: log
  enabled: true
  paths:
    - /opt/httpd/logs/xxxxx-access*
  fields:
    flow: apache
    application: myapp
```

Everything was working fine until I added the conditional in the output section.
Can anyone suggest what the reason could be, or point out what I am doing wrong above?

During development it often helps to output events to stdout with the rubydebug codec so you can look at the structure of each event. If you have indented your Filebeat config file correctly (hard to tell given the formatting), the custom field should end up under [fields][flow], not [flow], so your conditionals never match.
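A minimal sketch of both suggestions, assuming Filebeat's default behavior of nesting custom fields under `fields` (i.e. `fields_under_root` is not set):

```
output {
  # Temporary, for debugging: print every event so you can inspect
  # where "flow" actually ended up in the event structure
  stdout { codec => rubydebug }

  if [fields][flow] == "apache" {
    elasticsearch {
      hosts => ["xx.xx.xx.xx:9200"]
      index => "apachelogs-%{+YYYY.MM.dd}"
    }
  } else if [fields][flow] == "log4j" {
    elasticsearch {
      hosts => ["xx.xx.xx.xx:9200"]
      index => "jbosslogs-%{+YYYY.MM.dd}"
    }
  }
}
```

Alternatively, setting `fields_under_root: true` under the input in filebeat.yml places the custom fields at the top level of the event, in which case the original `[flow]` conditionals would work as written.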

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.