Hello
I need a configuration for the Logstash config file that handles Filebeat and a Cisco ASA firewall at the same time.
Please, I need help.
Thanks
I've been stuck on this for two months.
I have Filebeat working fine and I want to add the Cisco ASA config.
I need help, please.
What do you have so far? What problems do you have with it?
I need to collect logs from an ASA firewall, and from a Linux machine via Filebeat.
I know how to configure Filebeat, but I'm lost on configuring Logstash for the ASA firewall,
and I don't know how to make them work together at the same time.
Thank you
It's unlikely anyone will write this for you; you need to share what you have done so far.
I have 3 files under /conf.d
30-elasticsearch-output.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
10-syslog-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
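Worth noting: once a syslog-type input is added for the ASA (as happens further down in this thread), those events also satisfy the [type] == "syslog" condition above and run through this grok, which will generally not match raw ASA messages and will tag them with _grokparsefailure. A minimal sketch of one way to keep this filter to non-ASA syslog; the extra condition is an assumption, mirroring the "%ASA-" check used in the ASA filter later in the thread:

filter {
  if [type] == "syslog" and "%ASA-" not in [message] {
    # ... the existing grok / syslog_pri / date blocks would go here unchanged ...
  }
}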
02-beats-input.conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
And I want to configure Logstash to receive the Cisco ASA logs.
I have this tutorial to help: http://ict.renevdmark.nl/2015/10/22/cisco-asa-alerts-and-kibana/
but I'm lost on how to put this config together.
Well, Cisco devices don't speak the Beats protocol, so the first step would be to add an input that they're capable of sending to. Syslog, perhaps? If so there's an example of how to receive syslog messages in the Logstash documentation.
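For illustration, a minimal sketch of such an input, assuming plain syslog over UDP. The port number is arbitrary, and the type value only exists so the conditionals in the filters can route these events; since Logstash concatenates every file under conf.d into one pipeline, this can live in its own file alongside the beats input.

input {
  udp {
    # the ASA must be pointed at this host/port (e.g. via its "logging host" setting); 5514 is an arbitrary choice
    port => 5514
    # lets 'if [type] == "syslog"' conditionals pick these events up in the filters
    type => "syslog"
  }
}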
You can close the thread; I solved the problem by adding this config:
input {
  tcp {
    port => 5514
    type => "syslog"
  }
  udp {
    port => 5514
    type => "syslog"
  }
}
02-syslog.conf
filter {
  if [type] == "syslog" {
    if "%ASA-" in [message] {
      # parse the common Cisco ASA message IDs with the built-in CISCOFW grok patterns
      grok {
        match => [
          "message", "%{CISCOFW106001}",
          "message", "%{CISCOFW106006_106007_106010}",
          "message", "%{CISCOFW106014}",
          "message", "%{CISCOFW106015}",
          "message", "%{CISCOFW106021}",
          "message", "%{CISCOFW106023}",
          "message", "%{CISCOFW106100}",
          "message", "%{CISCOFW110002}",
          "message", "%{CISCOFW302010}",
          "message", "%{CISCOFW302013_302014_302015_302016}",
          "message", "%{CISCOFW302020_302021}",
          "message", "%{CISCOFW305011}",
          "message", "%{CISCOFW313001_313004_313008}",
          "message", "%{CISCOFW313005}",
          "message", "%{CISCOFW402117}",
          "message", "%{CISCOFW402119}",
          "message", "%{CISCOFW419001}",
          "message", "%{CISCOFW419002}",
          "message", "%{CISCOFW500004}",
          "message", "%{CISCOFW602303_602304}",
          "message", "%{CISCOFW710001_710002_710003_710005_710006}",
          "message", "%{CISCOFW713172}",
          "message", "%{CISCOFW733100}"
        ]
      }
      syslog_pri { }
      # look up geo coordinates for the source IP (MaxMind city database)
      geoip {
        source => "src_ip"
        target => "geoip"
        database => "/opt/logstash/GeoLiteCity.dat"
        add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
      }
      mutate {
        convert => [ "[geoip][coordinates]", "float" ]
      }
      # second lookup adds the AS number for the source IP
      geoip {
        database => "/opt/logstash/GeoIPASNum.dat"
        source => "src_ip"
      }
      mutate {
        add_field => { "logtype" => "SysLOG" }
        add_tag => [ "pre-processed", "Firewall", "ASA" ]
      }
    }
  }
}
99-outputs.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "log-%{type}-%{+yyyyMM}"
  }
  stdout { codec => rubydebug }
}
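Since Logstash concatenates every file under conf.d into one pipeline, both elasticsearch outputs above (30-elasticsearch-output.conf and 99-outputs.conf) receive every event, so Filebeat and ASA events each end up indexed twice. A minimal sketch of guarding the outputs with conditionals, assuming both files stay in place and keeping the index patterns from above:

output {
  if [@metadata][beat] {
    # events that arrived through the beats input
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    }
  } else if [type] == "syslog" {
    # events from the ASA tcp/udp inputs
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "log-%{type}-%{+yyyyMM}"
    }
  }
  stdout { codec => rubydebug }
}

With conditionals like these, Filebeat events keep their beats index and the ASA events land in the log-syslog index, instead of every event going through both outputs.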