Collect logs from DMZ

Hello all,
I'm totally new to Logstash. I want to install ELK on a local server (behind our firewall) and collect data from the DMZ. The data should be pulled by Logstash, not pushed by the front server. I read that I need either RabbitMQ or Redis, but I don't know how to install them or integrate them with Logstash. I'd also like to know whether I need a Logstash instance in front of the firewall acting as the "aggregator" that the Logstash behind the firewall connects to.
Could someone help me?
I'd like to use Logstash 5.0 if possible.
thank you very much
best regards
Nick

I read that I need either RabbitMQ or Redis, but I don't know how to install them or integrate them with Logstash.

Could you be more specific? Installation procedures for Redis and RabbitMQ are better covered elsewhere, and the redis and rabbitmq inputs should be fairly straightforward to set up.
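As a rough illustration (not tailored to your exact setup), the Redis side of the internal Logstash could be as small as this; the host name and key are placeholders you'd replace with your own values:

    input {
      redis {
        host      => "redis.dmz.example"   # placeholder broker host
        port      => 6379
        data_type => "list"                # read events from a Redis list
        key       => "logstash"            # list the shipper pushes events onto
      }
    }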

I'd also like to know whether I need a Logstash instance in front of the firewall acting as the "aggregator" that the Logstash behind the firewall connects to.

The Logstash instance inside the firewall would typically connect to the broker (e.g. Redis or RabbitMQ). Whether you need any Logstash instances in the DMZ depends on how you'll collect the logs.
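If you do run a Logstash instance in the DMZ, one possible sketch (assuming the broker also sits in the DMZ, so the internal Logstash is the side opening the connection, which matches your "pull" requirement) is a file input feeding a redis output; the log path, host name and key below are placeholders:

    # DMZ-side Logstash: tail the Tomcat logs and push them onto the broker
    input {
      file {
        path           => "/var/log/tomcat/*.log"   # placeholder log path
        start_position => "beginning"
      }
    }

    output {
      redis {
        host      => ["redis.dmz.example"]   # placeholder broker host in the DMZ
        data_type => "list"
        key       => "logstash"              # must match the key used by the internal redis input
      }
    }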

Hi Magnus,
thank you for your answer.
OK, let me try to explain better: we have a bunch of web servers running Tomcat whose logs we want to collect. Right now each server just keeps its own logs locally. In our staging environment I'm installing a Logstash server that will grab all these logs directly from the internal LAN, and we can only open selected firewall ports from the internal network to the DMZ.
What I need to know is: should we have a server in front of the firewall that collects all the Tomcat logs (that would be our preferred way)? What should I install on that server, only Redis or RabbitMQ, or also Logstash? How should I configure the internal Logstash to go to the external server and fetch the logs (as far as I know, Logstash receives logs in a default configuration)?
What would the requirements of the external server be?
And then: you say that Redis and RabbitMQ are better documented elsewhere. Could you please point me to a site or a bunch of sites that cover the installation/configuration of these brokers in a Logstash environment?
thank you very much
best regards
Nicola

we have a bunch of web servers running Tomcat whose logs we want to collect.

And these are in the DMZ?

How should I configure the internal Logstash to go to the external server and fetch the logs (as far as I know, Logstash receives logs in a default configuration)?

Use the redis or rabbitmq input plugin (or whatever broker you choose to use).
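If you pick RabbitMQ instead of Redis, the internal Logstash input could look roughly like this; host, queue name and credentials are placeholders for your own environment:

    input {
      rabbitmq {
        host     => "rabbitmq.dmz.example"   # placeholder broker host in the DMZ
        queue    => "logstash"               # queue the shipper publishes to
        durable  => true                     # survive broker restarts
        user     => "logstash"               # placeholder credentials
        password => "changeme"
      }
    }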

Could you please point me to a site or a bunch of sites that cover the installation/configuration of these brokers in a Logstash environment?

Consult the regular documentation of those pieces of software. They don't need to be set up in a special way to work with Logstash.

Hello Nick,

we have exactly the same setup/requirements in our infrastructure.

We installed Filebeat on each server that produces logs (Apache, Tomcat, nginx, ...). The decentralized Filebeat clients then send the data to a centralized Logstash:

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    # Each - is a prospector. Below are the prospector specific configurations.
    # Paths that should be crawled and fetched. Glob based paths.
    # For each file found under this path, a harvester is started.

    # Filter AplicationABC Tomcat logs
    -
      paths:
        - "/var/log/AplicationABC/tomcat/*"
      input_type: log
      document_type: beat
      fields:
        servertype: tomcat
        application: AplicationABC

output:
  logstash:
    hosts: ["%ENV_LOGSTASH_SERVER%"]

    # configure logstash plugin to loadbalance events between
    # configured logstash hosts
    #loadbalance: false

The centralized Logstash then sends the data through your opened port to an Elasticsearch instance, which I guess sits in a separate DMZ.
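For reference, a minimal sketch of what that centralized Logstash pipeline could look like; the port, hosts and index pattern are placeholders, not our actual configuration:

    input {
      beats {
        port => 5044                              # port the Filebeat clients connect to
      }
    }

    output {
      elasticsearch {
        hosts => ["elasticsearch.example:9200"]   # placeholder Elasticsearch endpoint
        index => "logstash-%{+YYYY.MM.dd}"        # daily indices
      }
    }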

But to be honest, I'm not an expert in log management. This is just the way we handled it, and it's working fine for us.

Regards,
Florian

Hi Florian,
thank you very much for your reply. I just set up the workflow from the DMZ to ELK, passing through Redis, and everything seems to work fine.
thank you very much
Nick
