Index overwritten by another index

Hi,

I'm new to ELK and I don't have much experience.

I have installed ELK 7.3 on Ubuntu 19.04 x64 and I want to collect my Nginx and Fail2Ban logs.

Elasticsearch, Logstash, Filebeat and Kibana are installed on the same server.

Filebeat is configured to output to Logstash:

output.logstash:
  hosts: ["localhost:5044"]

The Nginx Filebeat module is enabled.

I use the following Logstash configs:

fail2ban.conf:

input {
  file {
    path => "/var/log/fail2ban.log"
    type => "fail2ban"
    # start_position => "beginning"
  }
}

filter {
  if [type] == "fail2ban" {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:bantime} %{PROG}%{SPACE}(\[%{POSINT}\])?: %{WORD}%{SPACE}\[%{GREEDYDATA:jail}\] %{WORD:action} %{IPV4:ip}" ]
      add_tag => [ "%{action}" ]
      named_captures_only => true
    }

    if "_grokparsefailure" in [tags] {
      drop { }
    }

    date {
      match => [ "bantime", "ISO8601" ]
      target => "bantime"
    }

    geoip {
      source => "ip"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
    index => "f2b-%{+YYYY.MM.dd}"
  }
}

nginx.conf:

input {
  beats {
    port => 5044
    host => "0.0.0.0"
  }
}

filter {
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" ]
    overwrite => [ "message" ]
  }
  mutate {
    convert => ["response", "integer"]
    convert => ["bytes", "integer"]
    convert => ["responsetime", "float"]
  }
  geoip {
    source => "clientip"
    target => "geoip"
    add_tag => [ "nginx-geoip" ]
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    remove_field => [ "timestamp" ]
  }
  useragent {
    source => "agent"
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
    manage_template => false
    index => "ngx-%{+YYYY.MM.dd}"
  }
}

My problem:

When I run Logstash with fail2ban.conf and create the index: it's OK.
When I run Logstash with nginx.conf and create the index: it's OK.

But when the next day's index is created automatically, my fail2ban index is overwritten with Nginx index data.

I don't understand.

Do you have any idea?

Thank you very much

I guess the issue here is that you run one Logstash instance with both configs. Logstash merges all config files into one big pipeline, so events from both inputs pass through both filters and both outputs, and your Nginx events end up in the f2b-* index too.
What you want to do is configure Logstash to run with multiple pipelines, one for each config file.
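As a minimal sketch, the separation could look like this in `/etc/logstash/pipelines.yml` (the pipeline IDs and the `conf.d` paths are assumptions; adjust them to where your config files actually live):

```yaml
# Each pipeline gets its own isolated event flow,
# so the fail2ban and nginx filters/outputs no longer mix.
- pipeline.id: fail2ban
  path.config: "/etc/logstash/conf.d/fail2ban.conf"
- pipeline.id: nginx
  path.config: "/etc/logstash/conf.d/nginx.conf"
```

Restart Logstash after editing the file so both pipelines are started.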

Thank you very much, that fixed it!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.