Currently I am building a logging infrastructure based on the ELK stack.
Status quo: I am collecting network logs via syslog + a Logstash input, and server logs via Filebeat + a Logstash input (to keep the flexibility of custom prospectors) or Winlogbeat.
As long as everything stays internal, there is no problem.
But what is the best practice for servers that sit "outside"? A push from the server to the indexer will not be allowed; the indexer has to pull the data somehow.
The network devices, for example, can only speak the syslog protocol (port 514). For the sake of standardization, the servers should use Beats, too. X-Pack Security is not available.
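For context, the syslog side of the setup above can be sketched as a plain Logstash pipeline (the port matches what the devices send; note that binding to ports below 1024 usually requires elevated privileges or a port redirect):

```
# Logstash pipeline (inside) – receives syslog from network devices
input {
  syslog {
    port => 514   # ports < 1024 need root privileges or an iptables redirect
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder; adjust to your cluster
  }
}
```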
Any suggestions about best practices are very welcomed.
Thanks in advance!
Can you place a broker (Redis, RabbitMQ, Kafka, ...) in between that both parties can connect to?
Yes, that was my first thought.
Filebeat (outside) -> Redis/Rabbit/Kafka (outside) <- Logstash (Inside) -> Elastic (Inside).
But which one should I choose? Redis? RabbitMQ? Kafka? I don't have any experience with any of them so far.
Well, any of them will do the job. It's really up to you.
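To make the pattern concrete, here is a rough sketch of the Kafka variant of that chain, with Filebeat on the outside writing to the broker and Logstash on the inside pulling from it. The broker address `kafka.example.com:9092` and the topic name `filebeat-logs` are placeholders:

```
# filebeat.yml (outside) – ship to the broker instead of Logstash
output.kafka:
  hosts: ["kafka.example.com:9092"]   # placeholder broker address
  topic: "filebeat-logs"
  compression: gzip

# Logstash pipeline (inside) – pull from the broker and index
input {
  kafka {
    bootstrap_servers => "kafka.example.com:9092"
    topics => ["filebeat-logs"]
    codec => "json"   # Filebeat publishes JSON events
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

The same shape works with Redis or RabbitMQ; only the Filebeat output and the matching Logstash input plugin change.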
Are there no best practices? Or no recommendations from Elastic on what works best?
I will have to handle a very large number of messages. We are talking about several thousand UDP messages per second. I have heard that some brokers crash beyond a certain message volume.
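For UDP rates in that range, the receiving Logstash input itself is often the first bottleneck, before any broker. The udp input plugin exposes tuning knobs for this; the values below are illustrative starting points, not recommendations:

```
# Logstash udp input tuned for higher datagram rates (example values)
input {
  udp {
    port => 514
    workers => 4          # parallel threads decoding received datagrams
    queue_size => 20000   # in-flight datagram buffer before drops occur
  }
}
```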
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.