Filebeat > Nginx Module

Hi @stephenb, thanks for your reply and analysis. I am currently using version 7.1 of Kibana and Elasticsearch. I notice that the visualizations cannot find the correct indices, which has caused the template to fail to load. May I know whether there is a way to list out all the fields and, from there, re-link all the required fields?

Secondly, I notice that on the other end (the Nginx server) the log files (.access and .error) contain very little information. Things like location are not available. This could be one of the reasons why my Maps visualization shows "No Data Found".

Kindly advise and share your thoughts.

Hi all, if I perform a "filebeat export template > test.json", the file shows the fields that are required, e.g. geo.location. But when I open the visualization, it prompts me with "No Data". Why is that?

hi @skyluke.1987

filebeat export template > test.json

This command shows the template that will be loaded when setup is run, not what IS currently loaded in the Elasticsearch cluster.

If you want to see what is currently loaded use this

curl http://localhost:9200/_template/filebeat-7.1.1

Or go to Dev Tools and run:

GET /_template/filebeat-7.1.1

Also confusing to me is that some of your screenshots show Logstash. Are you using Logstash as well? I would first get the simple Filebeat -> Elasticsearch setup working directly.

The screenshot above shows the Logstash output, not the Filebeat setup; that is a little more complex to set up.

If I were you, I would start with a clean setup; otherwise you will need to remove the template, index patterns, and existing Filebeat indices.
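A minimal clean-up sketch, assuming a single-node cluster on localhost:9200 with no security enabled and the default 7.1.1 index naming (the index pattern itself still has to be deleted in Kibana under Management -> Index Patterns):

```shell
# Remove the loaded index template (assumes Filebeat 7.1.1 naming).
curl -XDELETE "http://localhost:9200/_template/filebeat-7.1.1"

# Remove the existing Filebeat indices.
curl -XDELETE "http://localhost:9200/filebeat-7.1.1-*"

# Re-create the template, index pattern, and dashboards.
./filebeat setup
```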

Just to test it out...

I just built a brand new 7.1.1 single-node Elasticsearch and Kibana on localhost, and everything worked fine the first time. I did not use Logstash.

I simply started Elasticsearch and Kibana without editing any settings.

Enabled the nginx module:

./filebeat modules enable nginx

Ran setup:

./filebeat setup

Downloaded the nginx example logs file (see below for link).

Edited modules.d/nginx.yml and set the path to the nginx log file I just downloaded:

# Module: nginx
# Docs:

- module: nginx
  # Access logs
  access:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: [ "/Users/sbrown/Downloads/nginx_logs.log" ]

Then started Filebeat:

./filebeat -e

The data loaded and the visualizations work fine, with one exception: there are no error logs, so those panels are blank; this data set only contains access logs. The data is from May 2015.
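To confirm the data actually landed, a quick check (assuming the cluster is on localhost:9200 with no security); also remember to set the Kibana time picker to cover May 2015, or the dashboards will show no data:

```shell
# Count the documents ingested into the Filebeat indices.
curl "http://localhost:9200/filebeat-*/_count?pretty"
```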

Here is the data I loaded....


Hi @stephenb, thanks for your detailed reply. My setup involves two different servers (A & B), but the output setting in my filebeat.yml points to Elasticsearch. Is that how it works?

Secondly, I already have the visualization templates and dashboards in Kibana, and I am guessing these caused some issues when I re-ran the ./filebeat setup command and the load-template command. May I know whether there is a method to clean up this situation and do a clean install?


Do you have this field / data in your index? When I open the dashboards, much of the info cannot be found.

Hi, for your reference these are the fields:

  "default_field" : [

Part 2:


Part 3:


Part 4: Final


Did that come from:

GET /_template/filebeat-7.1.1

If so, further down you will see this, which is correct. The output is very long.

"source" : {
  "properties" : {
    "geo" : {
      "properties" : {
        "region_iso_code" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "continent_name" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "city_name" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "country_iso_code" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "country_name" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "name" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "region_name" : {
          "ignore_above" : 1024,
          "type" : "keyword"
        },
        "location" : {
          "type" : "geo_point"
        }
      }
    }
  }
}

In the index pattern you should see these fields like this shown below...

If not, delete that index pattern from the Kibana GUI
(Management -> Index Patterns)
and run ./filebeat setup again.
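As a sketch for checking whether the documents themselves carry the geo data the Maps visualization needs (assuming the cluster is on localhost:9200 with no security):

```shell
# Fetch one document that has source.geo.location populated;
# zero hits means GeoIP enrichment never happened on ingest.
curl "http://localhost:9200/filebeat-*/_search?q=_exists_:source.geo.location&size=1&pretty"
```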

Apologies, but I don't think I am able to help much more ... to do a clean install, uninstall and re-install Elasticsearch, and make sure the data directory under the Elasticsearch install is removed before you reinstall.

Hi, may I check with you: my client side has Filebeat 7.3.0 installed, while the ES server's Filebeat is on 7.1.1. Will there be an issue in terms of compatibility and the data format?

Hi @stephenb, I managed to get these fields, but the dashboard still cannot capture the data.

What I did was remove the client Filebeat and install the 7.1.1 version.


After that, I noticed I am facing some difficulties installing Logstash on the client server; it's CentOS 6.10 with Java 1.8.0. Will there be any problem connecting these two servers?

Hi, is there anyone who can help troubleshoot this issue? Thanks.

@stephenb, may I know what your index name is for this Nginx data in Elasticsearch?

I came across a forum post suggesting that the template only recognizes nginx-* indices, and that this is required to load those templates.

If you use the Filebeat nginx module with all the default settings, the nginx logs will be indexed into indices whose names match the pattern filebeat-*.
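A quick way to verify which indices exist, assuming a local cluster without security:

```shell
# List the Filebeat indices with their health, document counts, and sizes.
curl "http://localhost:9200/_cat/indices/filebeat-*?v"
```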

Hi, so this is expected? Are you able to help me with this issue? I have no solution to it.

I found this online and wanted to load this template, but it fails.

Can anyone share why? Is it incompatible?

How can I check the Filebeat dashboards' compatibility, and which version did I install?
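A sketch for comparing the installed versions, assuming Filebeat runs from its install directory and Elasticsearch is on localhost:9200 without security; as a rule of thumb, the Beats version should not be newer than the Elasticsearch version, and the dashboards loaded by ./filebeat setup match the version of the Filebeat binary that loaded them:

```shell
# Version of the local Filebeat binary.
./filebeat version

# Version of the Elasticsearch cluster (see "version.number" in the response).
curl "http://localhost:9200"
```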

Dear all, is there anyone who can help with my question? It's been some time.

Almost 99% of my dashboards cannot display the collected data, even though data collected from the other servers is stored and received in our Elasticsearch DB. I wonder why the default dashboards cannot display it.

I have tried various methods to resolve this, but none of them work.

Next, the Filebeat indices were deleted and reindexed previously, but there is still no data reflected on the dashboards.