Trying to use filebeat postgresql conf.d to push logs to kibana

Hey guys,

I'm trying to configure Filebeat to output to Logstash; the full pipeline is as follows:

filebeat > redis cache > logstash > elastic > kibana

Below is a screenshot of what I get in Kibana.

Only about two log entries come in and then it stops. Looking at the Filebeat logs, I see the same message repeated constantly:

    Jul 07 11:05:05 DEVCCM13DCTDB01.cpp.nonlive filebeat[27775]: 2020-07-07T11:05:05.187Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Tue.log; Backoff now.

Below is the Filebeat config for PostgreSQL:

[root@DEVCCM13DCTDB01 filebeat]# cat conf.d/postgresql.yml
- type: log
  paths:
    - /var/log/pg_log/postgresql-*.log
  scan_frequency: 5s
  fields:
    custom_log_type: postgresql

Can anyone help me understand why this is?


Hi!

The log line you shared means that the logfile you are collecting from has no new lines to read. Could you confirm that?

C.

Hi Chris,

The logfile in question is actively being written to:

[root@DEVCCM13DCTDB01 filebeat]# lsof | grep pg_log
lsof: WARNING: can't stat() cifs file system /mnt/database
      Output information may be incomplete.
filebeat   27775                  root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27776           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27777           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27778           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27779           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27780           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27781           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27782           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27783           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27784           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27785           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27786           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27787           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27788           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27789           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27790           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27819           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  27961           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  28066           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  35175           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
filebeat   27775  35176           root    6r      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log
postgres   42393              postgres    4w      REG                8,2    115718   33590374 /var/log/pg_log/postgresql-Tue.log

It updates quite often as well:

[root@DEVCCM13DCTDB01 filebeat]# tail -n 30 -f /var/log/pg_log/postgresql-Tue.log
2020-07-14 10:48:14.742 UTC  @ [48728-10201] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:48:15.787 UTC  @ [48728-10202] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:48:16.832 UTC  @ [48728-10203] : LOG:  archive command failed with exit code 1
2020-07-14 10:48:16.832 UTC  @ [48728-10204] : WARNING:  archiving transaction log file "00000001000000A700000075" failed too many times, will try again later
2020-07-14 10:48:26.679 UTC  @ [48724-53973] : LOG:  checkpoint starting: time
2020-07-14 10:48:27.298 UTC  @ [48724-53974] : LOG:  checkpoint complete: wrote 6 buffers (0.0%); 0 transaction log file(s) added, 0 removed, 0 recycled; write=0.525 s, sync=0.029 s, total=0.618 s; sync files=4, longest=0.015 s, average=0.007 s; distance=42 kB, estimate=26198 kB
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:48:54.403 UTC  @ [48728-10205] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:48:55.458 UTC  @ [48728-10206] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:48:56.503 UTC  @ [48728-10207] : LOG:  archive command failed with exit code 1
2020-07-14 10:48:56.503 UTC  @ [48728-10208] : WARNING:  archiving transaction log file "00000001000000A700000075" failed too many times, will try again later
2020-07-14 10:49:33.764 UTC 10.89.90.20(42980) xxx@xxxviewstore [24197-2] COMMIT: LOG:  duration: 537.201 ms  execute S_3: COMMIT
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:49:56.623 UTC  @ [48728-10209] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:49:57.676 UTC  @ [48728-10210] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:49:58.721 UTC  @ [48728-10211] : LOG:  archive command failed with exit code 1
2020-07-14 10:49:58.721 UTC  @ [48728-10212] : WARNING:  archiving transaction log file "00000001000000A700000075" failed too many times, will try again later
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:50:58.798 UTC  @ [48728-10213] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:50:59.844 UTC  @ [48728-10214] : LOG:  archive command failed with exit code 1
cp: failed to access ‘/mnt/database/00000001000000A700000075’: Permission denied
2020-07-14 10:51:00.888 UTC  @ [48728-10215] : LOG:  archive command failed with exit code 1
2020-07-14 10:51:00.888 UTC  @ [48728-10216] : WARNING:  archiving transaction log file "00000001000000A700000075" failed too many times, will try again later

Thanks!

Could you run Filebeat in debug mode (./filebeat -e -d "*") and share the logs?

C.

I had to cut out chunks to stay under the character limit; hope this helps.

[root@DEVCCM13DCTDB01 filebeat]# filebeat -e -d "*"
2020-07-14T13:39:41.401Z        INFO    instance/beat.go:610    Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2020-07-14T13:39:41.402Z        DEBUG   [beat]  instance/beat.go:662    Beat metadata path: /var/lib/filebeat/meta.json
2020-07-14T13:39:41.402Z        INFO    instance/beat.go:618    Beat ID: b90d9dd6-2a7d-459c-86fa-546f7bb236f3
2020-07-14T13:39:41.402Z        DEBUG   [conditions]    conditions/conditions.go:98     New condition contains: map[]
2020-07-14T13:39:41.402Z        DEBUG   [processors]    processors/processor.go:101     Generated new processors: add_fields={"fields":{"environment":"dev"}}, add_fields={"fields":{"custom_log_type":"audit"}}, condition=contains: map[]
2020-07-14T13:39:41.402Z        DEBUG   [seccomp]       seccomp/seccomp.go:117  Loading syscall filter  {"seccomp_filter": {"no_new_privs":true,"flag":"tsync","policy":{"default_action":"errno","syscalls":[{"names":["accept","accept4","access","arch_prctl","bind","brk","chmod","clock_gettime","clone","close","connect","dup","dup2","epoll_create","epoll_create1","epoll_ctl","epoll_pwait","epoll_wait","exit","exit_group","fchdir","fchmod","fchown","fcntl","fdatasync","flock","fstat","fstatfs","fsync","ftruncate","futex","getcwd","getdents","getdents64","geteuid","getgid","getpeername","getpid","getppid","getrandom","getrlimit","getrusage","getsockname","getsockopt","gettid","gettimeofday","getuid","inotify_add_watch","inotify_init1","inotify_rm_watch","ioctl","kill","listen","lseek","lstat","madvise","mincore","mkdirat","mmap","mprotect","munmap","nanosleep","newfstatat","open","openat","pipe","pipe2","poll","ppoll","pread64","pselect6","pwrite64","read","readlink","readlinkat","recvfrom","recvmmsg","recvmsg","rename","renameat","rt_sigaction","rt_sigprocmask","rt_sigreturn","sched_getaffinity","sched_yield","sendfile","sendmmsg","sendmsg","sendto","set_robust_list","setitimer","setsockopt","shutdown","sigaltstack","socket","splice","stat","statfs","sysinfo","tgkill","time","tkill","uname","unlink","unlinkat","wait4","waitid","write","writev"],"action":"allow"}]}}}
2020-07-14T13:39:41.402Z        INFO    [seccomp]       seccomp/seccomp.go:124  Syscall filter successfully installed
2020-07-14T13:39:41.402Z        INFO    [beat]  instance/beat.go:941    Beat info       {"system_info": {"beat": {"path": {"config": "/etc/filebeat", "data": "/var/lib/filebeat", "home": "/usr/share/filebeat", "logs": "/var/log/filebeat"}, "type": "filebeat", "uuid": "b90d9dd6-2a7d-459c-86fa-546f7bb236f3"}}}
2020-07-14T13:39:41.403Z        INFO    [beat]  instance/beat.go:950    Build info      {"system_info": {"build": {"commit": "60dd883ca29e1fdd5b8b075bd5f3698948b1d44d", "libbeat": "7.5.1", "time": "2019-12-16T21:56:14.000Z", "version": "7.5.1"}}}
2020-07-14T13:39:41.403Z        INFO    [beat]  instance/beat.go:953    Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":8,"version":"go1.12.12"}}}
2020-07-14T13:39:41.407Z        INFO    [beat]  instance/beat.go:957    Host info       {"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-05-18T23:22:06Z","containerized":false,"name":"DEVCCM13DCTDB01.cpp.nonlive","ip":["xxx","xxx"],"kernel_version":"3.10.0-1062.4.1.el7.x86_64","mac":["xxx","xxx"],"os":{"family":"redhat","platform":"rhel","name":"Red Hat Enterprise Linux Server","version":"7.4 (Maipo)","major":7,"minor":4,"patch":0,"codename":"Maipo"},"timezone":"UTC","timezone_offset_sec":0,"id":"6ca77495007b44b5b7983fa8f0e34d11"}}}
2020-07-14T13:39:41.408Z        INFO    [beat]  instance/beat.go:986    Process info    {"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"ambient":null}, "cwd": "/etc/filebeat", "exe": "/usr/share/filebeat/bin/filebeat", "name": "filebeat", "pid": 12287, "ppid": 56336, "seccomp": {"mode":"filter","no_new_privs":true}, "start_time": "2020-07-14T13:39:40.810Z"}}}
2020-07-14T13:39:41.408Z        INFO    instance/beat.go:297    Setup Beat: filebeat; Version: 7.5.1
2020-07-14T13:39:41.408Z        DEBUG   [beat]  instance/beat.go:323    Initializing output plugins
2020-07-14T13:39:41.409Z        DEBUG   [publisher]     pipeline/consumer.go:137        start pipeline event consumer
2020-07-14T13:39:41.409Z        INFO    [publisher]     pipeline/module.go:97   Beat name: DEVCCM13DCTDB01.cpp.nonlive
2020-07-14T13:39:41.409Z        WARN    beater/filebeat.go:152  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-07-14T13:39:41.409Z        INFO    [monitoring]    log/log.go:118  Starting metrics logging every 30s
2020-07-14T13:39:41.409Z        INFO    kibana/client.go:117    Kibana url: xxx:5601
2020-07-14T13:39:41.604Z        INFO    kibana/client.go:117    Kibana url: xxx:5601
2020-07-14T13:39:41.630Z        DEBUG   [dashboards]    dashboards/kibana_loader.go:146 Initialize the Kibana 7.5.1 loader
2020-07-14T13:39:41.630Z        DEBUG   [dashboards]    dashboards/kibana_loader.go:146 Kibana URL http://elk-kibana-lb.mdv.cpp.nonlive:5601
2020-07-14T13:39:41.630Z        DEBUG   [dashboards]    dashboards/kibana_loader.go:146 Importing directory /usr/share/filebeat/kibana/7
2020-07-14T13:39:42.191Z        DEBUG   [dashboards]    dashboards/kibana_loader.go:146 Import directory /usr/share/filebeat/kibana/7
2020-07-14T13:40:11.414Z        INFO    [monitoring]    log/log.go:145  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":60,"time":{"ms":71}},"total":{"ticks":390,"time":{"ms":415},"value":0},"user":{"ticks":330,"time":{"ms":344}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":7},"info":{"ephemeral_id":"784c5249-ad63-428e-a9a6-614b51284565","uptime":{"ms":30036}},"memstats":{"gc_next":8380528,"memory_alloc":4296952,"memory_total":62852896,"rss":45707264},"runtime":{"goroutines":16}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"redis"},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0}},"system":{"cpu":{"cores":8},"load":{"1":0.06,"15":0.2,"5":0.14,"norm":{"1":0.0075,"15":0.025,"5":0.0175}}}}}}
2020-07-14T13:40:34.027Z        INFO    instance/beat.go:780    Kibana dashboards successfully loaded.
2020-07-14T13:40:34.027Z        INFO    instance/beat.go:429    filebeat start running.
2020-07-14T13:40:34.028Z        DEBUG   [test]  registrar/migrate.go:159        isFile(/var/lib/filebeat/registry) -> false
2020-07-14T13:40:34.028Z        DEBUG   [test]  registrar/migrate.go:159        isFile() -> false
2020-07-14T13:40:34.028Z        DEBUG   [test]  registrar/migrate.go:152        isDir(/var/lib/filebeat/registry/filebeat) -> true
2020-07-14T13:40:34.028Z        DEBUG   [test]  registrar/migrate.go:159        isFile(/var/lib/filebeat/registry/filebeat/meta.json) -> true
2020-07-14T13:40:34.028Z        DEBUG   [registrar]     registrar/migrate.go:51 Registry type '0' found
2020-07-14T13:40:34.028Z        DEBUG   [registrar]     registrar/registrar.go:125      Registry file set to: /var/lib/filebeat/registry/filebeat/data.json
2020-07-14T13:40:34.028Z        INFO    registrar/registrar.go:145      Loading registrar data from /var/lib/filebeat/registry/filebeat/data.json
2020-07-14T13:40:34.028Z        INFO    registrar/registrar.go:152      States Loaded from registrar: 12
2020-07-14T13:40:34.028Z        WARN    beater/filebeat.go:368  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.

Hi again!

I don't think this part of the logs is particularly helpful. Is there anything related to the postgresql module? Do any errors appear over a longer period?

Also, do you think the ingest-pipeline-related message could be relevant? What kind of configuration do you use in Logstash? Usually ingest pipelines must be loaded into Elasticsearch when Logstash is just a passthrough.

In addition, do you see events being created and printed in the output of Filebeat? Maybe you can enable console output and check whether logs are being collected by Filebeat, so as to better locate where in your pipeline the issue is.
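For example, something like this in filebeat.yml would print events to the console (just a sketch; note that Filebeat allows only one output at a time, so the Redis output would need to be commented out while testing):

```yaml
# Temporary debugging output: print each published event to stdout.
# Remember to re-enable the Redis output afterwards.
output.console:
  pretty: true
```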

Just some thoughts that might be helpful.

C.

@ChrsMark

I did attempt to use modules.d/postgresql.yml, but that had its own issues, so I opted to use conf.d/postgresql.yml instead.

Correct me if I'm wrong, but isn't the ingest pipeline stuff just for Filebeat modules? This is my setup for Logstash:

[root@MDVELKLGPRO01 logstash]# ll conf.d/
total 68
-rw-r--r--. 1 root root 1901 Jul 14 13:57 100_input.conf
-rw-r--r--. 1 root root  300 Jul  3 11:52 101_messages.conf
-rw-r--r--. 1 root root  475 Jul  3 12:49 102_syslog.conf
-rw-r--r--. 1 root root  666 Jul  3 12:49 103_secure.conf
-rw-r--r--. 1 root root  510 Jul  3 11:56 104_audit.conf
-rw-r--r--. 1 root root  487 Jul  3 12:49 105_zabbix_agent.conf
-rw-r--r--. 1 root root  491 Jul  3 12:49 106_clamd.conf
-rw-r--r--. 1 root root 1093 Jul  3 12:49 107_maillog.conf
-rw-r--r--. 1 root root  153 Feb 14 11:51 109_eventhub.conf
-rw-r--r--. 1 root root  368 Jul  3 12:49 110_cron.conf
-rw-r--r--. 1 root root  197 Jun  2 13:44 111_wildfly.conf
-rw-r--r--. 1 root root 1306 Jul  3 12:49 112_nginx.conf
-rw-r--r--. 1 root root 1752 Jul  3 12:49 113_haproxy.conf
-rw-r--r--. 1 root root 1902 Jul 14 10:05 114_postgresql.conf
-rw-r--r--. 1 root root  143 Jul  3 12:49 120_winlogbeat.conf
-rw-r--r--. 1 root root 1918 Jul  9 14:35 999_outputs.conf
drwxr-xr-x. 2 root root   69 Mar 10 13:54 grok-patterns

Contents of my PostgreSQL conf:

[root@MDVELKLGPRO01 specifications]# cat /etc/logstash/conf.d/114_postgresql.conf
filter {
  if [fields][custom_log_type] == "postgresql" {
    #codec => multiline {
    #  pattern => "^\s"
    #  what => "previous"
    #}
    if "COPY" in [message] { drop {} }
   else if "duration" in [message] {
      if "execute" in [message] {
        mutate { add_tag => ["sql-query"] }
        grok {
          match => ["message", "%{DATESTAMP:timestamp} CET \[%{POSINT:pid}\]: \[%{POSINT}-1] user=%{NOTSPACE:connection_id} %{WORD:level}:  duration: %{NUMBER:duration_ms} ms  execute %{GREEDYDATA}: %{GREEDYDATA:request}"]
        }
      } else if "parse" in [message] {
        mutate { add_tag => ["sql-parse"] }
        grok {
          match => ["message", "%{DATESTAMP:timestamp} CET \[%{POSINT:pid}\]: \[%{POSINT}-1] user=%{GREEDYDATA:connection_id} %{WORD:level}:  duration: %{NUMBER:duration_ms} ms  parse %{GREEDYDATA}: %{GREEDYDATA:request}"]
        }
      } else if "bind" in [message] {
        mutate { add_tag => ["sql-bind"] }
        grok {
          match => ["message", "%{DATESTAMP:timestamp} CET \[%{POSINT:pid}\]: \[%{POSINT}-1] user=%{GREEDYDATA:connection_id} %{WORD:level}:  duration: %{NUMBER:duration_ms} ms  bind %{GREEDYDATA}: %{GREEDYDATA:request}"]
        }
      } else if "statement" in [message] {
        mutate { add_tag => ["sql-query"] }
        grok {
          match => ["message", "%{DATESTAMP:timestamp} CET \[%{POSINT:pid}\]: \[%{POSINT}-1] user=%{NOTSPACE:connection_id} %{WORD:level}:  duration: %{NUMBER:duration_ms} ms  statement: %{GREEDYDATA:request}"]
        }
      }
      # Note: grok captures the field as duration_ms, so convert that;
      # the Ruby filter must use the event.get/event.set API on Logstash 5.x+.
      mutate { convert => [ "duration_ms", "float" ] }
      ruby { code => "event.set('elapsed', event.get('duration_ms').to_f / 1000)" }
    } else {
      mutate { add_tag => "db-message" }
      grok {
        match => ["message", "%{DATESTAMP:timestamp} CET \[%{POSINT:pid}\]: \[%{POSINT}-1] user=%{NOTSPACE:connection_id} %{WORD:level}: %{GREEDYDATA:message}"]
      }
    }
  }
}

Hi!

So you use your own Logstash filters to parse the PostgreSQL logs, right? If that's the case, we first need to verify that logs are being sent from Filebeat to Logstash. Could you check that? This way we will be able to isolate the problem.

Also, do you see anything that looks problematic in your Logstash output/logs?
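A quick way to verify is to temporarily add a stdout output to your Logstash pipeline and watch whether the PostgreSQL events arrive (a sketch; add it alongside your existing outputs and remove it once you're done debugging):

```
output {
  # Print every event to the Logstash console for debugging.
  stdout { codec => rubydebug }
}
```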

So that's where, I believe, the issue lies.
Filebeat doesn't seem to be pushing the logs out consistently.
It initially gives something like:

Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "@timestamp": "2020-07-16T09:02:56.714Z",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "@metadata": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "beat": "filebeat",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "type": "_doc",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "version": "7.5.1"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "agent": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "hostname": "DEVCCM13DCTDB01.cpp.nonlive",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "id": "b90d9dd6-2a7d-459c-86fa-546f7bb236f3",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "version": "7.5.1",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "type": "filebeat",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "ephemeral_id": "e369ff1d-a095-408e-abb0-c45e65d9b5f9"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "ecs": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "version": "1.1.0"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "log": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "offset": 39684,
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "file": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "path": "/var/log/pg_log/postgresql-Thu.log"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: }
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "message": "2020-07-16 09:02:49.264 UTC  @ [30097-475] : LOG:  checkpoint starting: time",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "fields": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "custom_log_type": "postgresql",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "environment": "dev"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "input": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "type": "log"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "host": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "name": "DEVCCM13DCTDB01.cpp.nonlive"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: }
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: }
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "@timestamp": "2020-07-16T09:02:56.714Z",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "@metadata": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "beat": "filebeat",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "type": "_doc",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "version": "7.5.1"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "log": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "offset": 39761,
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "file": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "path": "/var/log/pg_log/postgresql-Thu.log"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: }
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "message": "2020-07-16 09:02:49.886 UTC  @ [30097-476] : LOG:  checkpoint complete: wrote 6 buffers (0.0%); 0 transaction log file(s) added, 0 removed, 0 recycled; write=0.535 s, sync=0.021 s, total=0.622 s; sync files=4, longest=0.009 s, average=0.005 s; distance=16364 kB, estimate=16364 kB",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "input": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "type": "log"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "fields": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "environment": "dev",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "custom_log_type": "postgresql"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "ecs": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "version": "1.1.0"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "host": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "name": "DEVCCM13DCTDB01.cpp.nonlive"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: },
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "agent": {
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "type": "filebeat",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "ephemeral_id": "e369ff1d-a095-408e-abb0-c45e65d9b5f9",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "hostname": "DEVCCM13DCTDB01.cpp.nonlive",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "id": "b90d9dd6-2a7d-459c-86fa-546f7bb236f3",
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: "version": "7.5.1"
Jul 16 09:02:57 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: }

Followed by this afterwards, nonstop:

[root@DEVCCM13DCTDB01 filebeat]# journalctl -xef --unit filebeat.service | grep postgresql-Thu
Jul 16 09:10:54 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:10:54.050Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 40397
Jul 16 09:10:54 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:10:54.050Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:10:56 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:10:56.737Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Backoff now.
Jul 16 09:10:59 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:10:59.050Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:10:59 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:10:59.050Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 40397
Jul 16 09:10:59 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:10:59.050Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:04 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:04.051Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:04 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:04.051Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 40397
Jul 16 09:11:04 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:04.051Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:06 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:06.738Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Backoff now.
Jul 16 09:11:09 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:09.052Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:09 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:09.052Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 40397
Jul 16 09:11:09 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:09.052Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:14 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:14.053Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:14 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:14.053Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 40397
Jul 16 09:11:14 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:14.053Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 09:11:16 DEVCCM13DCTDB01.cpp.nonlive filebeat[42561]: 2020-07-16T09:11:16.738Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Bac

I suspect the Filebeat side needs extra configuration, but I'm not sure what exactly.
Postgres writes to the log roughly every 5 minutes.

Well, to be honest, I can't really make out what the issue could be here.

Maybe setting close_inactive to something like 10m could resolve it: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log-close-inactive
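Applied to the input you shared earlier, that would look something like this (a sketch based on your existing conf.d/postgresql.yml):

```yaml
# Sketch of /etc/filebeat/conf.d/postgresql.yml with the suggested setting.
- type: log
  paths:
    - /var/log/pg_log/postgresql-*.log
  scan_frequency: 5s
  # Keep the harvester's file handle open longer between writes,
  # since PostgreSQL only logs every few minutes.
  close_inactive: 10m
  fields:
    custom_log_type: postgresql
```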

I added the suggested close_inactive setting in /etc/filebeat/conf.d/postgresql.yml and got the following output in the Filebeat logs:

Jul 16 14:29:26 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:26.492Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:31 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:31.464Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Backoff now.
Jul 16 14:29:31 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:31.493Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:31 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:31.493Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 102436
Jul 16 14:29:31 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:31.493Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:36 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:36.494Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:36 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:36.494Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 102436
Jul 16 14:29:36 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:36.494Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
**Jul 16 14:29:41 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: "path": "/var/log/pg_log/postgresql-Thu.log"**
**Jul 16 14:29:41 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: "path": "/var/log/pg_log/postgresql-Thu.log"**
Jul 16 14:29:41 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:41.465Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Backoff now.
Jul 16 14:29:41 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:41.494Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:41 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:41.494Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 102792
Jul 16 14:29:41 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:41.494Z        DEBUG        [input]        log/input.go:563        Harvester for file is still running: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:42 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:42.465Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Backoff now.
Jul 16 14:29:44 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:44.465Z        DEBUG        [harvester]        log/log.go:107        End of file reached: /var/log/pg_log/postgresql-Thu.log; Backoff now.
Jul 16 14:29:46 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:46.495Z        DEBUG        [input]        log/input.go:421        Check file for harvesting: /var/log/pg_log/postgresql-Thu.log
Jul 16 14:29:46 DEVCCM13DCTDB01.cpp.nonlive filebeat[78982]: 2020-07-16T14:29:46.495Z        DEBUG        [input]        log/input.go:511        Update existing file for harvesting: /var/log/pg_log/postgresql-Thu.log, offset: 102792

Aside from the usual chatter, I did notice these two new log entries (marked **) around the same time PostgreSQL wrote into /var/log/pg_log/postgresql-Thu.log.

This, however, hasn't fixed the issue. I notice that Filebeat (after bouncing the service) initially seems to read the file, then for whatever reason stops.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.