Server workflow direction

Hello all,
I'm absolutely new to Logstash, Elasticsearch and Kibana. My boss asked me to put in place a cluster of Logstash and Elasticsearch servers for collecting all the logs from our DMZ servers.
What I wish to know is:

  • What is the workflow direction? Do the DMZ servers send to the Logstash servers or vice versa? We have firewalls in between and the security auditor won't let us open firewall ports from the external network to the internal one. Is there a workaround or something else we could implement?
  • Is there a detailed step-by-step guide for installing the cluster and the clients?
  • Is there a guide listing which ports should be opened?

thank you very much
best regards
Nicola

What is the workflow direction? Do the DMZ servers send to the Logstash servers or vice versa?

Normally the hosts producing the logs push them to a small set of central Logstash servers that process the logs and send them to Elasticsearch. If you don't want the DMZ hosts opening connections into the internal network, you could set up a broker (e.g. Redis or RabbitMQ) in the DMZ that the DMZ hosts send to, and have the central Logstash servers connect to the broker and pull the events from it. That kind of architecture has other benefits too, but it's obviously a bit more complicated.
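To make that concrete, here is a rough sketch of what the two sides could look like as Logstash configurations. The hostnames, the file path and the Redis key are just placeholders, and option names can vary between Logstash versions, so treat this as an illustration rather than something to copy verbatim.

On the DMZ hosts (or a small shipper instance in the DMZ), something along these lines would push events to a Redis broker that also lives in the DMZ:

    # Runs inside the DMZ: reads local logs and pushes them to a Redis
    # broker in the same zone, so nothing connects into the internal network.
    input {
      file {
        path => "/var/log/syslog"    # placeholder path
      }
    }
    output {
      redis {
        host      => "redis.dmz.example"    # placeholder broker hostname
        data_type => "list"
        key       => "logstash"
      }
    }

On the central Logstash servers, the matching puller could look like this:

    # Runs on the internal network: connects out to the Redis broker in the
    # DMZ, pulls the queued events and indexes them into Elasticsearch.
    input {
      redis {
        host      => "redis.dmz.example"    # same placeholder broker
        data_type => "list"
        key       => "logstash"
      }
    }
    output {
      elasticsearch {
        hosts => ["es01.internal.example:9200"]    # placeholder ES node
      }
    }

With this setup the only rule you need between the zones is one that lets the internal Logstash servers reach the broker in the DMZ, which keeps the connection direction internal-to-external.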

Is there a detailed step-by-step guide for installing the cluster and the clients?

All the official documentation is at elastic.co. There are many tutorial blog posts and similar floating around, but watch out: some of them are outdated or just poorly written.

Is there a guide listing which ports should be opened?

For communication between Logstash instances there is no default port; you pick one yourself. If you put a broker in the DMZ, the port requirements obviously depend on the broker.
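For reference, the usual defaults are 6379 for Redis, 5672 for RabbitMQ's AMQP listener, and 9200/9300 for Elasticsearch's HTTP and transport ports. If you do let a Logstash instance receive events directly, the port is simply whatever you configure in the input; a minimal sketch (the port number here is an arbitrary choice):

    input {
      tcp {
        port  => 5000          # arbitrary; open this one on the firewall
        codec => json_lines    # assumes the sender ships one JSON event per line
      }
    }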

Hi Magnus, thank you for your reply.
Is the Redis or RabbitMQ broker a sort of workaround, or is it something that can easily be put in place?
And is there documentation from Elastic on how to configure such a solution?

Thank you again
Best regards
Nicola