Sending filebeat to logstash loses system fields

Hi all,

I've been setting up the ELK stack and hitting an issue that's been bugging me for a while.

When I set up Filebeat (on a RHEL server) to output directly to ES, everything seems to work correctly, in that I'm able to filter by system.auth.ssh.event etc.
However, when I update Filebeat to go through Logstash (no filters), the events arrive in ES and are viewable in Kibana, but the messages are not being fully parsed (not sure how else to explain it), i.e. none of the system.* fields are available.

I've trawled through various topics and SO to no avail.

I originally thought it might have been a missing filter on Logstash, but noticed there were plugins I could use and assumed these would cover the basic pattern matches.
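For what it's worth, this is roughly the kind of explicit filter I was imagining. The grok pattern below is only my own rough sketch at matching sshd auth lines (the field names under system.auth are my guess at the module's layout, not taken from the module itself):

filter {
  if "filebeat" in [tags] {
    grok {
      # Hypothetical pattern for sshd "Accepted ... / Failed ..." lines
      match => {
        "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} sshd(?:\[%{POSINT:pid}\])?: %{WORD:[system][auth][ssh][event]} %{WORD:[system][auth][ssh][method]} for %{USERNAME:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}"
      }
    }
  }
}

But maintaining patterns like this for every log type feels like reinventing what the module already does.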

Is there something obvious I'm missing here?

input.conf

input {
 beats {
  port => 5044
  host => "10.202.11.6"
 }
}

filter.conf

filter {
  if "filebeat" in [tags] {
    mutate {
      add_tag => [ "DEBUGTAG" ]
    }
  }
}

output.conf

output {
  if "DEBUGTAG" in [tags] {
    stdout {
      codec => rubydebug
    }
  }
  if "filebeat" in [tags] {
    elasticsearch {
      hosts => ["10.202.11.4:9200"]
      user => "elastic"
      password => "s3cR3t"
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  }
}
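One thing I've since spotted in the Logstash docs, though I haven't tried it yet so take it as a guess at the fix: the Filebeat modules do their parsing in Elasticsearch ingest pipelines, and when events go through Logstash that pipeline apparently isn't invoked unless the elasticsearch output forwards to it, e.g.:

output {
  if "filebeat" in [tags] {
    elasticsearch {
      hosts => ["10.202.11.4:9200"]
      user => "elastic"
      password => "s3cR3t"
      index => "filebeat-%{+YYYY.MM.dd}"
      # Forward each event to the ingest pipeline Filebeat records in @metadata
      pipeline => "%{[@metadata][pipeline]}"
    }
  }
}

If that's right, it would explain why the system.* fields appear when Filebeat talks to ES directly but vanish when Logstash sits in between.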

filebeat.yml

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: true
setup.template.settings:
tags: [ "filebeat" ]
output.logstash:
  hosts: ["10.202.11.6:5044"]

Cheers,
Kev

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.