Filebeat modules via Logstash

Hi everyone,

I have an issue: when I try to integrate the Filebeat PANOS module with Elasticsearch via Logstash, it fails to bring across the pre-built dashboards that it does bring across when integrating Filebeat directly with Elasticsearch.

Any assistance would be greatly appreciated.

Thanks

Hi @st1988 Welcome to the community, and thanks for trying the Elastic Stack.

It can be a bit challenging the first time you get into the stack; we are working to make this easier with our new Fleet.

So here is what I do:

In short: I make Filebeat to Elasticsearch work directly first, then I route through Logstash... If you cannot do that, that's OK, but it is the best way to check that everything works before routing through Logstash.

1st) Clean up any existing filebeat indices, as they may not be set up correctly.
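For example, something like this (a sketch only: the host and credentials are placeholders matching the sample conf further down, so adjust for your cluster, and only delete if you do not need the data):

```shell
# List any existing filebeat indices (host/user/password are placeholders)
curl -u elastic:secret "http://localhost:9200/_cat/indices/filebeat-*?v"

# Delete them so setup can start clean -- destructive, use with care
curl -u elastic:secret -X DELETE "http://localhost:9200/filebeat-*"
```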

2nd) Set up Filebeat with the modules, dashboards, etc., and run it directly.
Follow the instructions on this page up THROUGH Step 4.

You will need to enable and configure the PANOS module.
This will set up the index templates, dashboards, etc.
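That step usually boils down to commands like these (run from the Filebeat install directory, with filebeat.yml still pointed at Elasticsearch and Kibana so setup can load its assets):

```shell
# Enable the PANOS module (creates modules.d/panos.yml from the disabled template)
filebeat modules enable panos

# Load the index templates, ingest pipelines, and Kibana dashboards
filebeat setup -e
```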

If you can (firewalls etc. permitting), I would run Filebeat directly to Elasticsearch using Step 5 just to see if everything works... If it does, great: take a look and check that the data and dashboards look correct. If so, go ahead and stop. If you cannot run directly, that's fine; proceed to step 3.

3rd) Set up Logstash. I will put a working sample beats-logstash.conf below.

So install and set up Logstash using the conf below; it will basically act as a pass-through but will still use any ingest pipelines, index templates, dashboards, etc. You can put this pipeline in the conf.d folder if you installed via a package manager (deb, rpm, etc.).

Start Logstash using the conf below. Once Filebeat is pointed at it (next step), data should be flowing.
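Before starting it for real, it is worth syntax-checking the pipeline (paths here assume a package-manager install; adjust if you unpacked a tarball elsewhere):

```shell
# Validate the pipeline config and exit without starting the pipeline
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/beats-logstash.conf --config.test_and_exit
```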

4th) Go back into Filebeat, comment out the Kibana and Elasticsearch output sections, enable the Logstash output section, and then start Filebeat... this should then pump the data via Logstash to Elasticsearch.
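In filebeat.yml that change looks roughly like this (hosts are placeholders matching the sample conf below):

```yaml
# filebeat.yml -- comment out the Elasticsearch output...
#output.elasticsearch:
#  hosts: ["http://localhost:9200"]

# ...and enable the Logstash output instead (port matches the beats input below)
output.logstash:
  hosts: ["localhost:5044"]
```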

Filebeat -> Logstash -> Elasticsearch ....

################################################
# beats->logstash->es default config.
################################################
input {
  beats {
    port => 5044
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "http://localhost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      pipeline => "%{[@metadata][pipeline]}" 
      user => "elastic"
      password => "secret"
    }
  } else {
    elasticsearch {
      hosts => "http://localhost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      user => "elastic"
      password => "secret"
    }
  }
}

@stephenb - Thank you very much, I will give that a go. Just out of interest, if I want to add other modules, do I need to do the same process? Ideally we would send directly to Elasticsearch, but we need the ability to output to a raw file as well as sending to Elasticsearch, which is where Logstash comes in.

Thanks,

Sam

@st1988 You should run setup once again if you added / enabled new Filebeat modules; if you already know which modules you need, you can enable them when you run setup originally. (There are some subtleties, but that is my suggestion.)
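For example, adding another module later (cisco here is just an illustration; substitute whichever module you need):

```shell
# Enable the additional module
filebeat modules enable cisco

# Re-run setup so its templates, pipelines, and dashboards get loaded
filebeat setup -e
```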

I am a bit unclear on the output-to-raw-file question; I am not sure what you are asking.

The flow we just spoke of would look like this
PANOS -> Log File -> Filebeat -> Logstash -> Elasticsearch

You could also run like this, with PANOS sending syslog directly to Filebeat instead of writing a log file:
PANOS -> Filebeat -> Logstash -> Elasticsearch

You do not NEED Logstash; Filebeat can read the log and send directly to Elasticsearch.
PANOS -> Log File -> Filebeat -> Elasticsearch

And of course it could look like this, again with PANOS sending syslog directly to Filebeat:
PANOS -> Filebeat -> Elasticsearch

All of these are valid architectures... yup, lots of options...

@stephenb Ah OK, brilliant. So the flow I believe we need is PANOS -> Filebeat (via syslog) -> Logstash (output to a raw txt file as well as to Elasticsearch).

Does that make sense?

Hmmmm yes that should work...

You will need to update the Logstash conf to write to a file; what "raw" means may be up to the interpreter... :slight_smile:
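One way to do that (a sketch only: the path is an example, and I am assuming the unparsed log line arrives in the `message` field, which is the usual case for Filebeat modules) is to add a `file` output alongside the existing `elasticsearch` outputs:

```
output {
  # ... keep the existing elasticsearch outputs as-is ...

  # Additionally write the raw event text to a rolling daily file
  file {
    path => "/var/log/panos/panos-%{+YYYY.MM.dd}.log"
    codec => line { format => "%{message}" }
  }
}
```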

Keep an eye out... supposedly-standard syslog headers are not always standard; if they aren't, the logs may not get parsed correctly.

Give it a try and let us know.

@stephenb - Thank you soooo much for your assistance this has all worked perfectly!

Thanks,

Sam
