Logstash and multiple Filebeat indices (different sources)

Hi,

I have set up a Wazuh IDS server and a separate ELK server. Logstash and Filebeat work fine, and so does the Kibana dashboard. My Logstash config currently parses the Wazuh logs shipped by Filebeat, and now I want to add indices for several more Filebeat sources:

  • nginx
  • apache2
  • mysql
  • system

My Logstash config for Filebeat (Wazuh logs):

# Wazuh - Logstash configuration file
# Remote Wazuh Manager - Filebeat input

input {
  beats {
    port => 5000
    codec => "json_lines"
  }
}

filter {
  geoip {
    source => "srcip"
    target => "GeoLocation"
    fields => ["city_name", "continent_code", "country_code2", "country_name", "region_name", "location"]
  }
  date {
    match => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "timestamp", "beat", "fields", "input_type", "tags", "count", "@version", "log", "offset", "type"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "wazuh-alerts-%{+YYYY.MM.dd}"
    document_type => "wazuh"
    template => "/etc/logstash/wazuh-elastic5-template.json"
    template_name => "wazuh"
    template_overwrite => true
  }
}

Now I also want to send the Filebeat modules for apache and nginx to Logstash, but how do I do that? Do I need to configure a separate Logstash beats input with a different index and document_type for each source, or can I do it all in one config file?

And I need to parse the apache and nginx logs: do I need grok for that?

The short answer is to look into using conditionals to select which filters and outputs to apply to different kinds of input data:

https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html
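For your case, a minimal sketch of a single combined pipeline could look like the one below. It is a sketch, not the definitive setup: the second port (5044), the tag names, and the per-source index names are all placeholders, and it assumes you set tags: ["nginx"] (and so on) on each prospector in your filebeat.yml so events can be told apart.

input {
  beats {
    port => 5000
    codec => "json_lines"
    tags => ["wazuh"]        # tag Wazuh traffic at the input
  }
  beats {
    port => 5044             # placeholder port for the other Filebeat sources
  }
}

filter {
  if "wazuh" in [tags] {
    # Your existing geoip / date / mutate filters go here, unchanged,
    # except that "tags" must be dropped from remove_field so the
    # output conditionals below can still see it.
  } else if "nginx" in [tags] or "apache2" in [tags] {
    # Default nginx and apache2 access logs both use the combined
    # log format, so the built-in pattern covers them.
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
      target => "@timestamp"
    }
  }
  # mysql and system logs would get their own blocks here.
}

output {
  if "wazuh" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "wazuh-alerts-%{+YYYY.MM.dd}"
      document_type => "wazuh"
      template => "/etc/logstash/wazuh-elastic5-template.json"
      template_name => "wazuh"
      template_overwrite => true
    }
  } else if "nginx" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "nginx-%{+YYYY.MM.dd}"
    }
  } else if "apache2" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "apache2-%{+YYYY.MM.dd}"
    }
  }
}

So to answer both questions: yes, it can all live in one config file, and for apache and nginx you only need grok if you parse the raw access logs in Logstash rather than relying on the Filebeat modules' ingest pipelines; the %{COMBINEDAPACHELOG} pattern is usually enough to start with.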
