Hi,
I made two configuration files, one for ASA and one for Fortigate. The problem is that I get duplicate logs in Kibana. How do I solve that?
Please provide more details. What do the files look like? Where are they stored?
I am sending logs using Filebeat and I have two config files in Logstash. When I send logs, I see each log twice; it's as if every log passes through both filters.
==> First file.conf
input {
beats {
port => "5044"
}
}
filter {
grok { match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{HOSTNAME:hostname} %{DATA:cisco_tag}: %{GREEDYDATA:cisco_message}" } }
# date is moved after grok so that syslog_timestamp exists when it runs;
# "MMM  d" (two spaces) matches single-digit days in syslog timestamps.
date {
match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
grok {
match => [
"cisco_message", "%{CISCOFW106001}",
"cisco_message", "%{CISCOFW106006_106007_106010}",
"cisco_message", "%{CISCOFW106014}",
"cisco_message", "%{CISCOFW106015}",
"cisco_message", "%{CISCOFW106021}",
"cisco_message", "%{CISCOFW106023}",
"cisco_message", "%{CISCOFW106100}",
"cisco_message", "%{CISCOFW110002}",
"cisco_message", "%{CISCOFW302010}",
"cisco_message", "%{CISCOFW302013_302014_302015_302016}",
"cisco_message", "%{CISCOFW302020_302021}",
"cisco_message", "%{CISCOFW305011}",
"cisco_message", "%{CISCOFW313001_313004_313008}",
"cisco_message", "%{CISCOFW313005}",
"cisco_message", "%{CISCOFW402117}",
"cisco_message", "%{CISCOFW402119}",
"cisco_message", "%{CISCOFW419001}",
"cisco_message", "%{CISCOFW419002}",
"cisco_message", "%{CISCOFW500004}",
"cisco_message", "%{CISCOFW602303_602304}",
"cisco_message", "%{CISCOFW710001_710002_710003_710005_710006}",
"cisco_message", "%{CISCOFW713172}",
"cisco_message", "%{CISCOFW733100}",
"cisco_message", "%{WORD:action} %{WORD:protocol} %{CISCO_REASON:reason} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}$
"cisco_message", "%{CISCO_ACTION:action} %{WORD:protocol} %{CISCO_REASON:reason}.*(%{IP:src_ip}).*%{IP:dst_ip} on interface %{GREEDYDATA:interface}",
"cisco_message", "Connection limit exceeded %{INT:inuse_connections}/%{INT:connection_limit} for input packet from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:$
"cisco_message", "TCP Intercept %{DATA:threat_detection} to %{IP:ext_nat_ip}/%{INT:ext_nat_port}.*(%{IP:int_nat_ip}/%{INT:int_nat_port}).*Average rate of %{INT:syn_av$
"cisco_message", "Embryonic connection limit exceeded %{INT:econns}/%{INT:limit} for %{WORD:direction} packet from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:$
]
}
}
output {
elasticsearch {
hosts => [ "localhost:9200"]
}
#stdout { codec => rubydebug }
}
==> Second file.conf
input {
beats {
port => "5044"
}
}
filter {
grok { match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{HOSTNAME:hostname} %{GREEDYDATA:source_data}" } }
# kv must read the field grok actually created: "source_data", not "data".
kv { source => "source_data" }
}
output {
elasticsearch {
hosts => [ "localhost:9200"]
}
#stdout { codec => rubydebug }
}
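For reference, the duplication is expected with this setup: Logstash concatenates every file in /etc/logstash/conf.d into a single pipeline, so both configs share one beats input, every event passes through both filter blocks, and both output blocks write it to Elasticsearch, which is why each log shows up twice. One fix is to merge the two files and route events with conditionals. A minimal sketch, assuming a custom device_type field is set per input in filebeat.yml (the field name and its values are placeholders, not something Filebeat sets on its own):
==> combined.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  # device_type is assumed to be added in filebeat.yml, e.g.
  # fields: { device_type: "asa" } on the ASA input and
  # fields: { device_type: "fortigate" } on the Fortigate input.
  if [fields][device_type] == "asa" {
    grok { match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{HOSTNAME:hostname} %{DATA:cisco_tag}: %{GREEDYDATA:cisco_message}" } }
    # ... date filter and the CISCOFW grok patterns from the first file ...
  } else if [fields][device_type] == "fortigate" {
    grok { match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{HOSTNAME:hostname} %{GREEDYDATA:source_data}" } }
    kv { source => "source_data" }
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}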
I would like to keep separate config files, so that I don't have to edit a single config file every time.
The files are stored in /etc/logstash/conf.d.
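Since you want to keep separate files, another option on Logstash 6.0+ is to run each file as its own pipeline via /etc/logstash/pipelines.yml, so neither pipeline sees the other's events; each pipeline then needs its own beats port (e.g. 5044 and 5045) with matching outputs in filebeat.yml. A sketch with placeholder pipeline ids and file names:
# /etc/logstash/pipelines.yml
- pipeline.id: asa
  path.config: "/etc/logstash/conf.d/first_file.conf"
- pipeline.id: fortigate
  path.config: "/etc/logstash/conf.d/second_file.conf"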
I think I found the solution. In the filter I added:
fingerprint {
method => "SHA1"
key => "KEY"
}
and in the output:
document_id => "%{fingerprint}"
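That works because documents indexed with the same _id overwrite each other instead of being indexed twice. For completeness, a sketch of where the two pieces go (source and target follow the fingerprint filter defaults; the key is a placeholder):
filter {
  fingerprint {
    source => "message"   # field to hash; "message" is the default
    method => "SHA1"      # with "key" set this computes an HMAC-SHA1
    key => "KEY"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    document_id => "%{fingerprint}"   # the filter's default target field
  }
}
Note this hides the duplicates (the second copy overwrites the first) rather than stopping events from running through both filters.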