Same information in different indexes

Hi all,

I'm a super noob with ELK; after 1 month I finally created beautiful dashboards with useful information for my company. But I have an issue, and I couldn't find anything about it when I searched.

Situation:
Logstash server config files:

  • Cisco ASA A
  • Cisco ASA B
  • Cisco ASA C

These 3 config files have different output indexes and different input UDP ports, but the information sent to "Cisco ASA A" is replicated in the indexes of Cisco ASA B & C.
Also, the physical devices "B" & "C" don't have syslog/netflow configured yet.

Doing a curl to Elasticsearch I saw the indexes of "B" and "C" growing just like "Cisco ASA A", and when I added the "ASA B" & "ASA C" indexes into Kibana I saw the same info as "Cisco ASA A".

I really don't know what is going on here. I'm pretty sure that I've misunderstood something.

Can anyone guide me with this situation?

Thanks in advance

Welcome to the Elastic community @mrognone! As a start, can you post your Logstash server config here? That'll help us work out what could be going wrong.

If you place multiple configuration files in a directory and point Logstash to it, it will read all of them and concatenate them. You may therefore need to use conditionals to ensure that data is not sent to all configured outputs.
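As a rough sketch of what that can look like (the type values, port, index name, and host below are just examples, not taken from your setup): each file sets a marker on its own input, and its output only accepts events carrying that marker.

# In the file for ASA A (example values only):
input {
  udp {
    port => 9995
    type => "asa_a"                # marker for events from this input
    codec => netflow { versions => [9] }
  }
}

output {
  if [type] == "asa_a" {           # only events from the ASA A input reach this index
    elasticsearch {
      index => "logstash-asa_a-%{+YYYY.MM.dd}"
      hosts => "localhost:9200"
    }
  }
}

# The ASA B and ASA C files would do the same with their own type, port, and index.

Without a conditional like that, every event ingested by any of the inputs passes through all of the configured outputs, which would match what you are seeing.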

Hi @Joshua_Rich & @Christian_Dahlqvist, thanks for helping me :slight_smile:

@Joshua_Rich, do you need all the config files in /etc/logstash/conf.d, or did I misunderstand you?

@Christian_Dahlqvist I will read about this; maybe there I can find a way to filter the info.

Here is the config file for "Cisco ASA A". The others have the same info, but the UDP input port, output index, and host are different.

input {
  udp {
    port => 9933
    codec => netflow {
      versions => [9]
    }
  }
}

filter {
  grok {
    match => { "host" => "10.9.254.1" }
  }

  geoip {
    add_tag => [ "GeoIP" ]
    database => "/etc/logstash/GeoLiteCity.dat" ### Change me to location of GeoLiteCity.dat file
    source => "netflow.ipv4_dst_addr"
  }

  if [geoip][city_name] == "" { mutate { remove_field => "[geoip][city_name]" } }
  if [geoip][continent_code] == "" { mutate { remove_field => "[geoip][continent_code]" } }
  if [geoip][country_code2] == "" { mutate { remove_field => "[geoip][country_code2]" } }
  if [geoip][country_code3] == "" { mutate { remove_field => "[geoip][country_code3]" } }
  if [geoip][country_name] == "" { mutate { remove_field => "[geoip][country_name]" } }
  if [geoip][latitude] == "" { mutate { remove_field => "[geoip][latitude]" } }
  if [geoip][longitude] == "" { mutate { remove_field => "[geoip][longitude]" } }
  if [geoip][postal_code] == "" { mutate { remove_field => "[geoip][postal_code]" } }
  if [geoip][region_name] == "" { mutate { remove_field => "[geoip][region_name]" } }
  if [geoip][time_zone] == "" { mutate { remove_field => "[geoip][time_zone]" } }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "logstash-asa_netflow_ba%{+YYYY.MM.dd}"
    hosts => "visualizeitlogmonitvm.viridian.local:9200"
  }
}

Thanks in advance.

Hi @Christian_Dahlqvist, how are you?

I used conditionals as you told me, and now it's working OK.
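(Roughly, for anyone who lands here later: each udp input got its own type and each elasticsearch output was wrapped in a matching conditional, e.g. if [type] == "asa_a" { elasticsearch { ... } } in the ASA A file, and the equivalent in the B and C files, so the three pipelines no longer cross.)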

Thanks a lot!! :slight_smile: