Hi all,
I am currently building a logging infrastructure based on the ELK stack.
Status quo: I am collecting network logs via syslog + a Logstash input, and server logs via Filebeat + a Logstash beats input (to keep the flexibility of custom prospectors) or Winlogbeat.
As long as this is internal, there is no problem.
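For reference, the internal side currently looks roughly like this. A minimal Logstash pipeline sketch; the ports and the Elasticsearch address are assumptions, adjust them to your environment:

```
# Minimal sketch of the internal pipeline (ports and hosts are assumptions)
input {
  # Network devices can only speak syslog on 514
  syslog {
    port => 514
  }
  # Filebeat / Winlogbeat shipping over the beats protocol (default port 5044)
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # assumed Elasticsearch endpoint
    index => "logs-%{+YYYY.MM.dd}"
  }
}
```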
But what is the best practice for servers that sit "outside"? A push from those servers to the indexer will not be allowed; the indexer has to pull the data somehow.
The network devices, for example, can only speak the syslog protocol (port 514). For the sake of standardization, the servers should use Beats as well. X-Pack Security is not available.
Any suggestions about best practices are very welcome.
Thanks in advance!
Filebeat (outside) -> Redis/RabbitMQ/Kafka (outside) <- Logstash (inside) -> Elasticsearch (inside).
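To make the idea concrete, here is a sketch of both ends, using Kafka purely as an example broker; the broker address `broker.dmz.example:9092` and the topic name `filebeat` are assumptions:

```
# filebeat.yml on the outside host (sketch; broker address and topic are assumptions)
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/*.log

output.kafka:
  hosts: ["broker.dmz.example:9092"]
  topic: "filebeat"
  required_acks: 1        # wait for leader acknowledgement
  compression: gzip
```

```
# Logstash (inside) pulling from the broker instead of accepting a push
input {
  kafka {
    bootstrap_servers => "broker.dmz.example:9092"   # assumed broker address
    topics => ["filebeat"]
    codec => "json"       # Filebeat writes events to Kafka as JSON
  }
}
```

This keeps the connection direction inside -> out: the indexer initiates the pull from the broker, and nothing pushes through the firewall into the internal network.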
But what to choose? Redis? RabbitMQ? Kafka? I don't have any experience with any of them so far.
Are there no best practices? Or no recommendations from Elastic about what works best?
I will have to handle a lot of messages; we are talking about several thousand UDP messages per second. I have heard that some brokers crash beyond a certain message volume.
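If most of that volume is syslog over UDP, the listener itself is often the first bottleneck, not the broker. A tuning sketch for the Logstash udp input; the values are assumptions to benchmark against your own traffic, not recommendations:

```
input {
  udp {
    port => 514
    workers => 4           # parallel threads decoding packets
    queue_size => 10000    # packets buffered in memory before drops occur
  }
}
```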