Filebeat > Nginx Module

Hi all, I found that the Nginx module's visualizations do not provide the full template. How can I get it, and what are the steps to install it?

When I view the dashboard, some panels show "Could not locate that index-pattern (id: filebeat-*), click here to re-create it".

Hi @skyluke.1987, it looks like the dashboards have been loaded but the index pattern is missing.
Can you check whether the filebeat-* index pattern exists, or have you configured a custom name for the Filebeat index?
These are the usual steps for loading the dashboards.
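For reference, the usual sequence looks roughly like this (a sketch, assuming Filebeat is run from its installation directory and Elasticsearch/Kibana are reachable at their default addresses):

```shell
# Enable the nginx module
./filebeat modules enable nginx

# Load the index template, index pattern, and dashboards
# into Elasticsearch and Kibana
./filebeat setup

# Or, to (re)load only the Kibana dashboards and index pattern
./filebeat setup --dashboards
```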

Hi @MarianaD, the filebeat-* index has been loaded, and its name is "filebeat-7.3.0". Will this index naming affect the creation of the dashboard?

Hi, I found a suggestion saying we should remove all the saved objects and re-import them. But my concern is that there are other saved objects which are not owned by this Filebeat Nginx module. What should I do?

Hi, can anyone help me with this?

Firstly, I have difficulty creating multiple Filebeat files by date, even though I have specified the export filename in the Filebeat > modules.d > nginx.yml file.

Next, regarding the Nginx module, I am still not able to get the dashboard working.

Did you run filebeat setup after you enabled the nginx module? setup loads the index template, the index pattern, and all the visualizations, etc.
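One quick way to verify what setup created is to query Kibana's saved objects API (a sketch, assuming Kibana is on its default port 5601 without security enabled):

```shell
# Look for an index pattern whose title matches filebeat-*
curl "http://localhost:5601/api/saved_objects/_find?type=index-pattern&search_fields=title&search=filebeat-*"
```

If the response contains no saved objects, the index pattern was never loaded.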

Hi yes, I did. Below is a screenshot of it.


Hi MarianaD, what if I created the index pattern with the name filebeat-7.3.0* ? Will this affect the result?

Hi @stephenb, I have this issue when I open the visualization:
How do I fix this?

I noticed I was running Filebeat 7.1 instead of the new Filebeat 7.3. How can I choose to run the new one? I remember I installed the new one before.

Hi there, has anyone encountered this problem and found a solution?

  1. How can I print out the data / fields received, and check whether they contain the required fields, e.g. geoip?

  2. How can I control the data fields, and what should I do if a field is not captured? Currently the visualization shows empty, with errors like "Could not locate that index-pattern-field (id: source.address)" and more.


Apologies, but it's a bit difficult to know what state your whole system is in. If you're all on 7.3, you could try deleting the index pattern in Kibana and then running filebeat setup again with the nginx module enabled; it should create the correct index pattern.

However, if you have data from both old and new Filebeats, it may not all work with the new index pattern.
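To see which Filebeat versions have actually written data, you could list the indices; the version is embedded in each index name (assuming Elasticsearch is on localhost:9200):

```shell
# Show all Filebeat indices with doc counts and sizes
curl "http://localhost:9200/_cat/indices/filebeat-*?v"
```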

I started from scratch: installed Filebeat, enabled the nginx module, ran setup, and then sent data directly to Elasticsearch; all the visualizations and dashboards load. That's not to say you're not running into issues, but from a clean configuration it should work.

Hi @stephenb, thanks for your reply and analysis. Currently I am using version 7.1 of Kibana and Elasticsearch. I notice that the visualizations cannot find the correct indices, which has caused the template to fail to load. Is there a way to list out all the fields and, from there, re-link the required ones?

Secondly, I notice that on the other end (the Nginx server) the log files (.access and .error) contain very little information. Things like locations are not available. This could be one reason why my Map visualization shows "No Data Found".

Kindly advise and share your thoughts.

Hi all, if I run "filebeat export template > test.json", the file shows that the required fields are there, e.g. geo.location etc. But when I open the visualization, it says "No Data". Why is that?

hi @skyluke.1987

filebeat export template > test.json

This command shows what template will be loaded when setup is run, not what IS currently loaded in the Elasticsearch cluster.

If you want to see what is currently loaded, use this:

curl http://localhost:9200/_template/filebeat-7.1.1

Or go to Dev Tools and run:

GET /_template/filebeat-7.1.1

Also confusing to me is that some of your screenshots show Logstash. Are you using Logstash as well? I would first get the simple Filebeat -> Elasticsearch path working directly.

The screenshot above shows the Logstash output, not filebeat setup; that is a little more complex to set up.

If I were you, I would start with a clean setup; otherwise you will need to remove the template, the index patterns, and the existing Filebeat indices.
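If you go the removal route, here is a rough sketch of the clean-up (adjust the template name to your Filebeat version; assumes default localhost ports and no security):

```shell
# Remove the Filebeat index template (use your version)
curl -X DELETE "http://localhost:9200/_template/filebeat-7.1.1"

# Remove the existing Filebeat indices
curl -X DELETE "http://localhost:9200/filebeat-*"

# Remove the filebeat-* index pattern in Kibana
curl -X DELETE "http://localhost:5601/api/saved_objects/index-pattern/filebeat-*" \
  -H "kbn-xsrf: true"

# Then re-run setup to load everything fresh
./filebeat setup
```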

Just to test it out...

I just built a brand-new 7.1.1 single-node Elasticsearch and Kibana on localhost, and everything worked fine the first time. I did not use Logstash.

I simply started Elasticsearch and Kibana without editing any settings.

Enabled the nginx module:

./filebeat modules enable nginx

Ran setup:

./filebeat setup

Downloaded the nginx example logs file (see below for link).

Edited modules.d/nginx.yml and set the path to the nginx log file I just downloaded:

# Module: nginx
# Docs:

- module: nginx
  # Access logs
  access:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: [ "/Users/sbrown/Downloads/nginx_logs.log" ]

Then started filebeat

./filebeat -e

The data loaded and the visualizations work fine, with one exception: this data set only contains access logs, so the error-log panels are blank. The data is from May 2015.
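After ingestion, the load can be sanity-checked from the command line (a sketch, again assuming Elasticsearch on its default port):

```shell
# Confirm the events were indexed
curl "http://localhost:9200/filebeat-*/_count?pretty"

# Look at one event to check that fields such as source.address
# and the geo fields were populated by the module's pipeline
curl "http://localhost:9200/filebeat-*/_search?size=1&pretty"
```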

Here is the data I loaded....

Hi @stephenb, thanks for your detailed reply. My setup involves two different servers (A & B), but the output setting in my filebeat.yml sends to Elasticsearch. Is that how it works?

Secondly, I already have the visualization templates and the dashboards in Kibana, and I am guessing these have caused some issues when I re-run the ./filebeat setup command and the load-template command. Is there a way to clean up this situation and do a clean install?


Do you have this field / data in your index? When I open the dashboards, a lot of info cannot be found.

Hi, for your reference, these are the fields:

  "default_field" : [

Part 2:


Part 3: