Logstash dropping a lot of logs

I have a setup where Filebeat reads Lumen logs and forwards them to Logstash, which parses one of the fields in each log line as JSON. I am only seeing partial output in my Kibana dashboard, and it seems a lot of logs are being dropped. I am confused because the console shows no errors. Here are my configs. First, the Filebeat config:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/parthib/recruitcrm/Albatross/storage/logs/lumen-*.log
  fields:
    log_type: lumen_logs
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
And my Logstash pipeline config:

input {
  tcp {
    port => 10514
    type => "syslog"
  }
  beats {
    port => 5044
    type => "beats"
  }
}

filter {

  if [type] == "beats" and [fields][log_type] == "lumen_logs" {
    grok {
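       # Hypothetical example of a line this pattern expects:
       # [2021-05-04 12:34:56] production.ERROR: https://example.com/api {"status":"failed"}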
       match => { "message" => ["\[%{TIMESTAMP_ISO8601:php_timestamp}\] %{DATA:origin_hostname}\.%{DATA:log_level}: (%{URI:request_url} )?%{GREEDYDATA:json_message}"] }
       add_field => {
         "received_at"   => "%{@timestamp}"
         "received_from" => "%{host}"
       }
    }
    if [json_message] {
      json {
        source => "json_message"
        target => "parsed_json_message"
      }
    }
  }
  
}

output {
  elasticsearch {
    hosts => ["https://xxxx.io:9243"]
    user => "elastic"
    password => "xxxx"
    index => "filebeat-logs"
  }
  stdout {
    codec => "rubydebug"
  }
  file {
    path => "/tmp/logstash-output.txt"
  }
}
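
Side note on debugging: since the console shows nothing, one way to check whether events are failing inside grok or the json filter (rather than vanishing silently) is to route failure-tagged events to their own file. A minimal sketch, relying on the plugins' default failure tags; the output path here is made up:

output {
  # Events that failed parsing carry these default tags
  if "_grokparsefailure" in [tags] or "_jsonparsefailure" in [tags] {
    file {
      path => "/tmp/logstash-parse-failures.txt"
    }
  }
}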

Update: I found out that the JSON filter is dropping messages that contain the same @timestamp. Is there any solution to this?
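
One workaround I can think of (a sketch, not verified against this exact setup): keep the parsed keys under their own target so a payload @timestamp cannot clash with the event's @timestamp, and explicitly rename it if it still appears; the field name payload_timestamp is made up:

filter {
  if [json_message] {
    json {
      source => "json_message"
      target => "parsed_json_message"   # parsed keys stay namespaced under this field
    }
    # If the payload carried its own @timestamp, move it out of the way
    if [parsed_json_message][@timestamp] {
      mutate {
        rename => { "[parsed_json_message][@timestamp]" => "[parsed_json_message][payload_timestamp]" }
      }
    }
  }
}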

Related: "Setting 'target' and 'source' to 'message' silently drops events" (logstash-plugins/logstash-filter-json issue #34 on GitHub)
