PostgreSQL Module in Filebeat Not Starting

Hi Team,
I am trying to integrate PostgreSQL logs into Kibana using the postgresql module. Here is my filebeat.yml:

######################## Filebeat Configuration ############################

#==========================  Modules configuration ============================
filebeat.modules:

#----------------------------- PostgreSQL Module -----------------------------
- module: postgresql
  # Logs
  log:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: /home/test.log

    # Prospector configuration (advanced). Any prospector configuration option
    # can be added under this section.
    #prospector:

#================================ Outputs ======================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output -------------------------------
output.elasticsearch:
  # Boolean flag to enable or disable the output module.
  enabled: true
  hosts: ["ip:port"]
  # Optional index name. The default is "filebeat" plus date
  # and generates [filebeat-]YYYY.MM.DD keys.
  # In case you modify this pattern you must update setup.template.name and setup.template.pattern accordingly.
  #index: "postgresqllog-%{+yyyy.MM.dd}"

#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here, or by using the `-setup` CLI flag or the `setup` command.
setup.dashboards.enabled: true

#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "ip:port"

When I try to set up Filebeat using Docker, the following error comes up:
CRIT Exiting: Error getting config for fielset postgresql/log: Error interpreting the template of the prospector: template: text:3:22: executing "text" at <.paths>: range can't iterate over /home/test.log
Exiting: Error getting config for fielset postgresql/log: Error interpreting the template of the prospector: template: text:3:22: executing "text" at <.paths>: range can't iterate over /home/test.log

I think the message indicates that var.paths is supposed to be a list of paths. Try:

- module: postgresql
  # Logs
  log:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths:
      - '/home/test.log'
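
For reference, both YAML styles below are equivalent ways to express the list; this is a sketch using the path from this thread:

```yaml
# Flow style: inline list with brackets
var.paths: ["/home/test.log"]

# Block style: one item per line; note the indentation under the key
var.paths:
  - "/home/test.log"
```

Either form gives the module's template a list it can iterate over.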

Hi @steffens
I tried this as well, with no success.
Got the same error message.

The error occurs when Filebeat expands the configuration template.

This sample script passes a string instead of a list and fails with the same error: https://play.golang.org/p/sSwki4hdhKQ

Updating the environments 'paths' variable to be a list resolves the problem: https://play.golang.org/p/iAJ6QJqb8bO

How are you starting Filebeat? Are you sure you are using the correct config file, with correct indentation, and the - character to start a list?

If the problem persists, please share the output of: ./filebeat -e -v -d '*,config' -c <path/to/config/file>.

@steffens
I am running Filebeat in Docker, using the following command:
docker run -v /home/filebeat.yml:/usr/share/filebeat/filebeat.yml docker.elastic.co/beats/filebeat:6.1.1

My console logs show the error:
2018/01/10 12:11:29.461369 beat.go:203: INFO Setup Beat: filebeat; Version: 6.1.1
2018/01/10 12:11:29.461528 client.go:123: INFO Elasticsearch url: http://ip:port
2018/01/10 12:11:29.463108 module.go:76: INFO Beat name: e3affd7e454c
2018/01/10 12:11:29.482046 filebeat.go:62: INFO Enabled modules/filesets: postgresql (log), ()
2018/01/10 12:11:29.482692 beat.go:635: CRIT Exiting: Error getting config for fielset postgresql/log: Error interpreting the template of the prospector: template: text:3:22: executing "text" at <.paths>: range can't iterate over /home/test.log
Exiting: Error getting config for fielset postgresql/log: Error interpreting the template of the prospector: template: text:3:22: executing "text" at <.paths>: range can't iterate over /home/test.log

Also, I have already shared my filebeat.yml with you. I tried using
var.paths:
  - "/home/test.log"
but I am still getting the above error. Can you help me with what I am doing wrong?

Please properly format your output and configs using the </> button. The config file is sensitive to proper formatting (use spaces, not tabs).

Please share the output as requested, by adding the run flag -d '*,config'.

@steffens
I was missing the brackets when specifying the log file path.
It worked for me when I changed it to:
var.paths: ["/home/Postgres_vv-qa-pgdb/*.log"]

Thanks. :slight_smile:
But now I am facing another problem:
Filebeat starts, but it is not picking up my logs, even though there are several log files at the specified location.
Here are the logs:
2018/01/10 13:22:21.326242 metrics.go:23: INFO Metrics logging every 30s
2018/01/10 13:22:21.326387 beat.go:443: INFO Beat UUID: 893b3c74-2102-4990-ba12-bdf6d1816d7b
2018/01/10 13:22:21.326397 beat.go:203: INFO Setup Beat: filebeat; Version: 6.1.1
2018/01/10 13:22:21.326539 client.go:123: INFO Elasticsearch url: http://10.3.100.210:32328
2018/01/10 13:22:21.327001 module.go:76: INFO Beat name: d9d69fba8427
2018/01/10 13:22:21.327886 filebeat.go:62: INFO Enabled modules/filesets: postgresql (log), ()
2018/01/10 13:22:21.328210 client.go:123: INFO Elasticsearch url: http://10.3.100.210:32328
2018/01/10 13:22:21.909385 client.go:651: INFO Connected to Elasticsearch version 6.0.0
2018/01/10 13:22:21.909555 client.go:69: INFO Kibana url: http://10.3.100.210:32454
2018/01/10 13:22:48.442468 beat.go:551: INFO Kibana dashboards successfully loaded.
2018/01/10 13:22:48.442527 beat.go:276: INFO filebeat start running.
2018/01/10 13:22:48.442713 registrar.go:71: INFO No registry file found under: /usr/share/filebeat/data/registry. Creating a new registry file.
2018/01/10 13:22:48.659456 registrar.go:108: INFO Loading registrar data from /usr/share/filebeat/data/registry
2018/01/10 13:22:48.659509 registrar.go:119: INFO States Loaded from registrar: 0
2018/01/10 13:22:48.659529 crawler.go:48: INFO Loading Prospectors: 1
2018/01/10 13:22:48.659581 registrar.go:150: INFO Starting Registrar
2018/01/10 13:22:48.659890 prospector.go:87: INFO Starting prospector of type: log; ID: 17973440715270663115
2018/01/10 13:22:48.659920 crawler.go:82: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2018/01/10 13:22:51.327000 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30001 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=2287344 beat.memstats.memory_total=8811168 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.output.type=elasticsearch libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 registrar.states.current=0 registrar.writes=1
2018/01/10 13:23:21.326620 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.info.uptime.ms=30000 beat.memstats.gc_next=4194304 beat.memstats.memory_alloc=2302880 beat.memstats.memory_total=8826704 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.pipeline.clients=1 libbeat.pipeline.events.active=0 registrar.states.current=0

@steffens
Any help on this would be much appreciated.

If you add -d "prospector" to the Filebeat CLI, it will print a "File Configs:" line showing the globs that Filebeat uses. Double-check that the files are in the right location. Also, you might want to delete the registry file (in a tar.gz installation, it's data/registry) to force re-reading the log files in case they were already read.

Ah, now I see that you are running Filebeat through Docker. In that case, did you mount the folder containing the PostgreSQL log files into the Docker container?

@tudor
Yes, I am running Filebeat through Docker, and I have mounted the folder inside the container.
This is the command I am using to run Filebeat:
docker run -v /$path/filebeat.yml:/usr/share/filebeat/filebeat.yml -v /home/Postgres_vv-qa-pgdb/*.log:/home docker.elastic.co/beats/filebeat:6.1.1

Is there something I am missing here?
In my filebeat.yml, I have set:
var.paths: ["/home/*.log"]

This is the path I have mounted inside the Docker container.
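
One thing to double-check in the docker run above: -v bind-mounts a file or a directory, not a shell glob, so -v /home/Postgres_vv-qa-pgdb/*.log:/home will not expose the log files the way you expect. A sketch of the usual approach (host paths taken from this thread, not verified) is to mount the whole directory and let var.paths do the globbing inside the container:

```sh
# Mount the config file and the log directory itself (read-only):
docker run \
  -v /home/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro \
  -v /home/Postgres_vv-qa-pgdb:/home:ro \
  docker.elastic.co/beats/filebeat:6.1.1
```

With this mount, var.paths: ["/home/*.log"] inside the container matches the host's log files.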

I'd recommend starting an interactive container with something like this:

docker run -ti -v /$path/filebeat.yml:/usr/share/filebeat/filebeat.yml -v /home/Postgres_vv-qa-pgdb/*.log:/home docker.elastic.co/beats/filebeat:6.1.1 bash

Then check that all paths are in the expected places.

Then, from the same interactive session, run filebeat -e -d "prospector" to check whether Filebeat opens the right paths.

@tudor Yes, it worked for me.
But the postgresql module is not able to parse my logs; it gives me an error that the grok parser failed to parse them.

Can you share the error and sample logs?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.