Hi people. I have an ELK 7.8.0 server running OK in testing mode.
I've configured Logstash to listen on UDP/514 for incoming remote syslog events. Logstash doesn't apply any filter, it just passes the input to Elasticsearch. I've created a new custom index called syslog-network-YYYYMMDD.
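For reference, the pipeline is roughly this (a simplified sketch of what I described above; the Elasticsearch host is just an example):

```
# Sketch of the current pipeline: no filters, raw syslog from UDP/514
# goes straight into a daily custom index.
input {
  udp {
    port => 514
    type => "syslog"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]        # example: local Elasticsearch
    index => "syslog-network-%{+YYYYMMdd}"    # daily index, as described above
  }
}
```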
When I go to Kibana's Discover section, I can see this type of log (it's just a sample):
When I go to Kibana's SIEM app and then to the Events section, I can't see the syslog events at all (I've created a custom dashboard on my custom index and I can see approx. 50 million syslog events per day).
This is my Events panel from the SIEM; it doesn't show any syslog events at all, even though there are more syslog events than netflow events:
How can I include the custom index in the Events panel of Kibana's SIEM section?
Every time I create a custom index, do I have to do any special task/action? Because all I do is create an index pattern for each of them, nothing more.
Are the fields of the incoming logs stored in my custom index relevant? Do they show up in the SIEM panels anyway? Because I did what you told me and the syslog events still don't appear in the SIEM. Do I maybe have to apply a Logstash filter to the incoming syslog data before sending it to Elasticsearch?
If I have indices like syslog-YYYYMMDD and syslog-network-YYYYMMDD, should I put syslog-* in the SIEM default indices? Or both syslog-* and syslog-network-*?
Dear Malte, I appreciate your help, it's very important to me.
You tell me that the logs "show up there", but what I showed you was the Discover panel, not the SIEM Events panel. I still don't see any syslog messages on the SIEM panel.
I'm not an ELK expert, I'm just starting out in this fantastic world, so let me ask you:
Do I have to use an ECS filter in Logstash in order to see the syslog events on the SIEM panel? Or can I use the Logstash syslog plugin instead?
If you add syslog-* to the settings, the logs will show up in the SIEM section in Kibana.
This is a screenshot from my Kibana.
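For reference, the setting is siem:defaultIndex in Kibana's Advanced Settings. As a rough sketch, it would end up looking something like this (keep whatever defaults you already have and just append your pattern; the defaults shown here may differ on your install):

```
siem:defaultIndex: apm-*-transaction*, auditbeat-*, endgame-*, filebeat-*, packetbeat-*, winlogbeat-*, syslog-*
```

Note that syslog-* already matches your syslog-network-* indices as well, so one pattern is enough.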
The SIEM from Elastic uses ECS (the Elastic Common Schema). If you modify your logs so that they meet ECS, then they are usable with the SIEM.
The good thing for you is that you have Cisco ASA firewalls. For those you should use the Filebeat Cisco module: https://www.elastic.co/guide/en/beats/filebeat/master/filebeat-module-cisco.html
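A minimal sketch of what enabling the ASA fileset could look like (the UDP port is just an example; use whatever port your ASA can reach):

```
# modules.d/cisco.yml -- enable with: filebeat modules enable cisco
- module: cisco
  asa:
    enabled: true
    var.input: syslog          # receive the logs over the network
    var.syslog_host: 0.0.0.0   # listen on all interfaces
    var.syslog_port: 9001      # example port; point the ASA's logging host here
```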
Filebeat is quite powerful and has modules for a lot of commonly used systems. Logstash itself has no ECS filter, but it is possible to just grok or dissect the message and then name the fields as needed.
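For example, here is a minimal sketch of that approach for classic RFC3164-style syslog lines (the pattern and field names are illustrative, not a complete ECS mapping):

```
filter {
  grok {
    # Split a "<PRI>timestamp host program[pid]: message" line into fields
    match => {
      "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
    }
  }
  date {
    # Use the timestamp from the log line as the event time
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
  mutate {
    # Rename to ECS-style field names so the SIEM app can work with them
    rename => {
      "syslog_hostname" => "[host][name]"
      "syslog_program"  => "[process][name]"
    }
  }
}
```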
If you go with Filebeat, you should get the best results.
Keep in mind that you should first know what you want to see in the SIEM and then ingest the data you need for that, instead of putting data in first and then thinking about what could be done with it.
If you don't want to use Filebeat, you should be able to put the syslog-* pattern in the advanced settings and then modify the detection rules to match your fields or queries.
Dear Malte, please let me ask you a last question:
Suppose I need to collect logs (syslog) from different platforms: Cisco routers and switches, a Cisco ASA firewall, generic UPS devices, Linux and Windows servers, and VMware hosts.
What is the best way to do this? Maybe the Cisco ASA and the Cisco routers/switches sending logs to the Filebeat Cisco module on two different UDP ports, and the other devices sending to Logstash on another UDP port, which then passes them on to Elasticsearch? Or something else?
What happens if I point my Cisco ASA at the standard syslog input of Filebeat instead of at the ASA option of Filebeat's Cisco module?
Will I get the same message fields and the same detection capability in the SIEM? What is the advantage of using the Filebeat modules instead of the standard Filebeat syslog input?
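Just so we are talking about the same thing, by "standard syslog input" I mean something like this (the port is just an example):

```
# filebeat.yml -- plain syslog input, no module, so the message arrives mostly unparsed
filebeat.inputs:
  - type: syslog
    protocol.udp:
      host: "0.0.0.0:514"   # example listen address/port
```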
I think you need a few more basics before building a SIEM.
Here is a small list of topics you should read about before continuing. You can start with Wikipedia and the Elastic documentation:
syslog: the definition of the format and protocol
using the stdout output of Filebeat or Logstash (see the sketch after this list)
Elasticsearch: mapping and ECS
Kibana: the benefits of having proper fields
The best thing would be to play around and see what happens. Every mistake helps you understand more.
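For the "play around" part, a simple way to see what your events actually look like is to print them to the console while testing, e.g. in Logstash (just a sketch):

```
output {
  # Print every event with all its parsed fields to the console while testing
  stdout { codec => rubydebug }
}
```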
You will see a lot of tutorials that use grok patterns and very long Logstash config files. The Filebeat modules, on the other hand, have a lot of commonly used configuration predefined, so you can use them out of the box.
Sorry, but I think your question has been answered. Try it out and get some experience.