I'm new to ELK, so I have some doubts about what I'm trying to do. I configured the ELK Stack and I'm trying to ingest some logs into Elasticsearch. One of them is the logs from a Fortinet Web Application Firewall.
It's not working: Logstash is not inserting the logs into Elasticsearch. When I run it I don't see any errors, but I don't know why it isn't working, or exactly how to do it.
What should I do?
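For context, this is roughly the kind of pipeline I have in mind (a minimal sketch; the UDP port, index name, and Elasticsearch address are placeholders for my lab setup, not from any Fortinet documentation):

```
input {
  udp {
    port => 5514
    type => "fortiweb"
  }
}

output {
  # stdout first, so I can confirm events arrive before worrying about indexing
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "fortiweb-%{+YYYY.MM.dd}"
  }
}
```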
@Badger, I tried the code I showed you above.
It did not work, and I don't know why. After using it, I can see data in Elasticsearch, but the data is not properly formatted:
If I use v009xxxxdate as the index, no data is found in Kibana after creating the index pattern.
If I use timestamp, the data I see in Kibana has the following format after creating the index pattern:
I also saw this output in the console:
[2020-03-10T20:49:35,718][WARN ][logstash.codecs.plain ][main] Received an event that has a different character encoding than you configured. {:text=>"v009xxxxdate=2020-02-05 time=09:16:52 log_id=0001 msg_id=00002 device_id=device vd=\"root\" timezone=\"(GMT+1:00)Brussels,Copenhagen,Madrid,Paris\" timezone_dayst=\"GMTc-1\" type=attack pri=alert
Any idea what could be happening?
Thank you very much
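In case it helps anyone, I'm guessing the warning means the appliance is not sending UTF-8, which is the codec's default. This is a sketch of what I'm thinking of trying, assuming the logs arrive over UDP (the ISO-8859-1 charset is only a guess):

```
input {
  udp {
    port => 5514
    type => "fortiweb"
    # Charset is a guess; the default is UTF-8, which would explain the
    # warning if the appliance sends something else.
    codec => plain { charset => "ISO-8859-1" }
  }
}
```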
I tried the code above and it does not ingest data into Elasticsearch. I also tried with the kv filter (same code as before), but that does not work either. Maybe it's caused by the format of the log.
Any idea?
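This is the kind of kv-based filter I mean (a sketch, assuming the message is space-separated key=value pairs like the sample above; the date/time handling is my own assumption):

```
filter {
  kv {
    source => "message"
    field_split => " "
    value_split => "="
  }

  # Assumption: combine the date and time fields the kv filter produced
  # into one field, then use it as the event timestamp.
  mutate {
    add_field => { "date_time" => "%{date} %{time}" }
  }
  date {
    match => ["date_time", "yyyy-MM-dd HH:mm:ss"]
    remove_field => ["date_time"]
  }
}
```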
I don't see the output, so it's not working properly... I mean, the data is not getting parsed.
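To check whether the filter itself is the problem, I'm testing it in isolation like this (a sketch: run it with `bin/logstash -f test.conf`, paste one sample log line on stdin, and inspect the parsed event on stdout):

```
input { stdin {} }

filter {
  kv {
    field_split => " "
    value_split => "="
  }
}

output { stdout { codec => rubydebug } }
```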
I even tried to ingest it into Elasticsearch, but no index was created.
@Badger, do you know how to remap fields into the ECS standard?
The problem I'm having is that I don't see the fields in Elastic SIEM even though I can see the data in Elasticsearch. It's because the data is not in the ECS standard, but how do I convert it?
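What I have in mind is renaming fields with mutate, roughly like this (a sketch: the left-hand names are from my sample log and may differ in yours; the right-hand names are ECS fields that Elastic SIEM reads):

```
filter {
  mutate {
    # Left side: field names the kv filter produced from my sample log
    # (yours may differ). Right side: ECS field names.
    rename => {
      "src"       => "[source][ip]"
      "dst"       => "[destination][ip]"
      "action"    => "[event][action]"
      "device_id" => "[observer][name]"
    }
    # ECS also expects event.kind; "event" is the generic value.
    add_field => { "[event][kind]" => "event" }
  }
}
```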