Cisco ASA Logging to Elastic Stack

Hi All,

I'm new to the Elastic Stack. I have built two dedicated Logstash servers and a cluster of three Elasticsearch servers, with Kibana installed on all of them. I'm not sure if that last part is correct, so any advice would be great at the moment.

I have configured my Logstash pipeline as separate files: one for the input, another for my filters, and a final config file for the output to Elasticsearch.

I have a load-balancer VIP sitting in front of Logstash, pointing to both servers, and another VIP pointing to the Kibana GUI for login.

I am having an issue seeing my logs in Kibana, and I'm not sure if it's related to Logstash, Elasticsearch, or Kibana.

I can see my Cisco ASA logs in Logstash. I used a syslog filter template for the Cisco ASA that I found online, and it looks like the config was accepted, as Logstash started with no errors.
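In case it's useful, I understand Logstash configs can be syntax-checked before a restart with something like the following (the paths assume a default .deb install, so they may differ on other setups):

```shell
# Syntax-check the pipeline files without starting the pipeline.
# Paths assume a default .deb install; adjust if yours differ.
LS=/usr/share/logstash/bin/logstash
if [ -x "$LS" ]; then
  sudo "$LS" --config.test_and_exit -f /etc/logstash/conf.d/
else
  echo "logstash binary not found at $LS"
fi
```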

If anyone can help me, I would greatly appreciate it. My business isn't sure of the benefits of using ELK, but I have read and seen the quality it can provide; I'm just not confident with the application and the build.

I'm happy to show outputs of my configs etc., please just ask, and also tell me what commands I need to run to get the details for you. My Linux skills are not great, so please bear with me.

I'm running version 7.6 on Ubuntu 18.04.4 LTS.

Thanks in advance,

Mo

In Kibana Index Management or Monitoring, do you see your index, and does it have an ingest rate?

If you see your index there, have you created the index pattern in Kibana? That is needed for Discover and the other Kibana features that read from an index.
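If it's easier, you can also ask Elasticsearch directly from the command line; something like this (pointed at any node in your cluster, `localhost` here is just a placeholder) lists the indices and their document counts:

```shell
# List all indices with their document counts; replace localhost with
# one of your Elasticsearch nodes if you run this from elsewhere:
curl -s --max-time 5 'http://localhost:9200/_cat/indices?v' \
  || echo 'cluster not reachable from here'
```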

Hi Len,

Thank you for getting back to me,

I don't see my index under Kibana > Elasticsearch Management > Index Management.

I see other indices listed, which I think are from my initial install where I installed Filebeat on my Logstash servers. But I was told I don't need Filebeat if I'm only getting logs from my Cisco ASA (syslog).

I created a specific index pattern, but I don't see anything now.

My Logstash input config:

```
input {
  tcp {
    port => 10514
    host => "0.0.0.0"
    type => syslog
  }
  udp {
    port => 10514
    host => "0.0.0.0"
    type => syslog
  }
}
```
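As a sanity check on that input, I believe something like the following should confirm the listeners are up and push a test line through (the message content is just an illustrative sample, and it assumes `ss` from iproute2 and netcat are installed):

```shell
# Check that Logstash is listening on 10514 (TCP and UDP); 'ss' ships
# with iproute2, installed by default on Ubuntu 18.04:
ss -tulnp | grep 10514 || echo 'nothing listening on 10514'

# Push a fake ASA-style syslog line through the TCP input; the message
# content is just an illustrative sample (requires netcat):
echo '<166>%ASA-6-302013: test message' | nc -w1 localhost 10514 \
  || echo 'could not connect to the TCP input'
```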


My Logstash output config:

```
output {
  if "_grokparsefailure" in [tags] {
    file {
      path => "/tmp/fail-%{type}-%{+YYYY.MM.dd}.log"
    }
  }

  if "cisco" in [tags] {
    file {
      path => "/tmp/%{type}-%{+YYYY.MM.dd}.log"
    }

    elasticsearch {
      hosts           => ["es_master.example.com:9200", "es.data01.example.com:9200", "es.data02.example.com:9200" ]
      manage_template => false
      index           => "network-%{+YYYY.MM.dd}"
      document_type   => "%{type}"
      document_id     => "%{fingerprint}"
    }
  }
  stdout { codec => rubydebug }
}
```

I used the following syslog filter from this site:

https://secops.one/2018/05/31/asa-logstash-config/

I built my entire Elastic Stack following the DigitalOcean guide on installing the Elastic Stack on Ubuntu. I configured separate Logstash files, modified logstash.yml, and modified elasticsearch.yml with the cluster options added. I haven't made any changes to Kibana except pointing it to the cluster IPs.

If you have any suggestions on how I can get my syslog index to appear in Kibana, that would be great.

I can see logs being sent from my ASA, and statistics are showing on my load balancer for the port, but I can't seem to see anything in Elasticsearch or in Kibana.
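If it helps, I can also query the cluster directly; as I understand it, something like this should show whether any documents reached the network index my output config writes to (`localhost` standing in for one of my Elasticsearch nodes):

```shell
# Count documents in the daily network-* indices the Logstash output
# writes to; replace localhost with one of the Elasticsearch nodes:
curl -s --max-time 5 'http://localhost:9200/network-*/_count?pretty' \
  || echo 'cluster not reachable from here'
```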

Thanks in advance

Mo

You can use the Logstash stats API to see if Logstash is really receiving input and sending output. Another option is to enable monitoring in Logstash; those stats then appear in Kibana.
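For example, run against one of your Logstash hosts (the monitoring API listens on port 9600 by default; swap in your Logstash server's address for `localhost` if running remotely):

```shell
# The monitoring API listens on 9600 by default; this shows per-pipeline
# event counts (events in / filtered / out) and per-plugin stats:
curl -s --max-time 5 'http://localhost:9600/_node/stats/pipelines?pretty' \
  || echo 'logstash monitoring API not reachable from here'
```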

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.