Logstash deletes PostgreSQL module fields


I have an ELK platform, version 7.1.0. To fix a timestamp inconsistency between my servers and Kibana, I created a Logstash filter that reads the timestamp from the logs and applies it to the @timestamp field in Kibana; so far so good. Now I'm trying to use some dashboards for PostgreSQL, but when I read the information from the logs, all the PostgreSQL module fields that Filebeat sent (user name, query, log duration, etc.) are deleted by Logstash. I was wondering if any of you know how to pass these fields through Logstash and avoid this data loss. This is the filter I use to fix the timestamp issue:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
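
For context on what might be happening: the Filebeat postgresql module parses its fields (user name, query, duration, etc.) in an Elasticsearch ingest pipeline, and routing events through Logstash bypasses that pipeline unless the elasticsearch output names it explicitly. A sketch of the output section with the `pipeline` setting Elastic documents for this case, assuming Filebeat 7.x populates `[@metadata][pipeline]` on module events:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # Hand the event to the Filebeat module's ingest pipeline in
    # Elasticsearch so the PostgreSQL fields are still extracted.
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```

The ingest pipelines also have to exist in Elasticsearch first; `filebeat setup --pipelines --modules postgresql` should load them if they are missing.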

Log example from filebeat to elasticsearch

Log example using logstash filter


