Hi guys,
I'm new to this platform and want to do some Cisco device monitoring. In my lab I've set up NetFlow and syslog on an ASA firewall, and now I can see NetFlow data and build dashboards in Kibana.
What perplexes me now is that I can find syslog messages under Observability --> Logs, like this:
but there is nothing on the dashboard:
[Filebeat Cisco] ASA Firewall
Here is the configuration in /etc/filebeat/modules.d/cisco.yml. What's wrong, and what more should I do?
- module: cisco
  asa:
    enabled: true
    var.input: syslog
    var.syslog_host: 0.0.0.0
    var.syslog_port: 9001
    var.log_level: 7
    #cisco.asa.message_id
    #cisco.asa.suffix
    #cisco.asa.source_interface
    #cisco.asa.destination_interface
    #cisco.asa.rule_name
    #cisco.asa.source_username
    #cisco.asa.destination_username
    #cisco.asa.mapped_source_ip
    #cisco.asa.mapped_source_host
    #cisco.asa.mapped_source_port
    #cisco.asa.mapped_destination_ip
    #cisco.asa.mapped_destination_host
    #cisco.asa.mapped_destination_port
    #cisco.asa.threat_level
    #cisco.asa.threat_category
    #cisco.asa.connection_id
    #cisco.asa.icmp_type
    #cisco.asa.icmp_code
    #cisco.asa.connection_type
    #cisco.asa.dap_records
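The commented field names above come from what the Cisco module's ingest pipeline extracts out of the standard ASA syslog header. As a rough illustration only (the real parsing is done by the module's Elasticsearch ingest pipeline with Grok patterns, not by code like this), here is a minimal sketch of how the `%ASA-severity-message_id:` prefix maps onto fields such as `cisco.asa.message_id`:

```python
import re

# Hypothetical sketch of the ASA syslog header format; field names mirror
# the module's, but the regex is an illustration, not the module's pattern.
ASA_HEADER = re.compile(r"%ASA-(?P<severity>\d)-(?P<message_id>\d{6}): (?P<body>.*)")

def parse_asa(line):
    """Extract severity, message ID, and body from an ASA syslog line."""
    m = ASA_HEADER.search(line)
    if not m:
        return None
    return {
        "log.level": int(m.group("severity")),
        "cisco.asa.message_id": m.group("message_id"),
        "message": m.group("body"),
    }

event = parse_asa("%ASA-6-302013: Built outbound TCP connection 123 for inside:10.0.0.1/80")
```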
BTW, the version is 7.8.
The configuration on my firewall:
syslog:
logging enable
logging timestamp
logging standby
logging buffer-size 409600
logging console debugging
logging monitor debugging
logging buffered debugging
logging trap debugging
logging history debugging
logging asdm debugging
logging device-id ipaddress inside system
logging host inside 10.226.xx.xx 17/9001
logging debug-trace
netflow
flow-export destination inside 10.226.xx.xx 2055
Elasticsearch, Kibana, Logstash, and Filebeat are all installed on one server.
It seems the pipeline is not working.
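When the pipeline seems dead, it helps to first confirm that UDP syslog packets actually arrive before digging into Filebeat itself. A self-contained sketch of that check (it binds its own throwaway listener on an ephemeral port; against the real setup you would instead send to the Filebeat host on UDP 9001 and watch the Filebeat logs for the event):

```python
import socket

# Bind a throwaway UDP listener on an ephemeral port (stand-in for
# Filebeat's syslog input, which this thread runs on UDP 9001).
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))
port = listener.getsockname()[1]
listener.settimeout(2)

# Send one ASA-style syslog line at it, as the firewall would.
line = b"%ASA-6-106100: access-list test permitted tcp ..."
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(line, ("127.0.0.1", port))

# If the datagram path works, the line comes back out unchanged.
data, _ = listener.recvfrom(2048)
sender.close()
listener.close()
```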
shaunak
(Shaunak Kashyap)
July 6, 2020, 6:34pm
Could you please share your complete filebeat.yml configuration here?
Thanks,
Shaunak
Hi Shaunak,
Thanks for the reply. Here is my filebeat.yml configuration.
After I switched the output from Logstash to Elasticsearch, the module started working.
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
  level: debug
  review: 1
  json.keys_under_root: true
  json.overwrite_keys: true
  json.add_error_key: true
  json.message_key: message
  multiline.pattern: '^\['
  multiline.negate: false
  multiline.match: after

setup.template.settings:
  index.number_of_shards: 1

setup.ilm.enabled: auto
setup.ilm.overwrite: true

setup.kibana:
  host: "localhost:5601"

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: geoip-info

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
After I switched the output to Elasticsearch, the dashboard started working.
But I cannot find log messages on this page now. Is that normal behavior?
Observability --> Logs
shaunak
(Shaunak Kashyap)
July 7, 2020, 12:38pm
Is that your entire filebeat.yml file? I was expecting to see a filebeat.config.modules section, as it is responsible for loading external module configuration files such as /etc/filebeat/modules.d/cisco.yml. Without that section it would seem that you are not using the Cisco module but are instead ingesting and parsing the logs with a manual configuration, which is not ideal.
Also, could you call the Elasticsearch Get Index Template API and post the Filebeat 7.8 template here, please? If it's large, feel free to use pastebin.com or gist.github.com and post the link here instead.
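For reference, the filebeat.config.modules section described above typically looks like this in a default filebeat.yml (paths and reload settings may differ per install):

```yaml
filebeat.config.modules:
  # Glob pattern for the external module configuration files to load,
  # e.g. /etc/filebeat/modules.d/cisco.yml on a package install.
  path: ${path.config}/modules.d/*.yml
  # Whether to reload module configs when they change on disk.
  reload.enabled: false
```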
shaunak
(Shaunak Kashyap)
July 7, 2020, 1:06pm
The cisco/asa fileset removes the original message field in its ingest pipeline, which is why you don't see it in the Kibana Logs UI. There is an issue to fix this in the future: https://github.com/elastic/beats/issues/14708 .
Thanks a lot, I will give it a try.
system
(system)
Closed
August 5, 2020, 6:17am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.