Filebeat fails to parse JSON from files containing newline-separated JSON lines (one event per line). It sometimes misses records and sometimes throws errors when traffic is high

Hi Ruffin,

If I transfer the same file again, it is processed successfully without any error.

Filebeat also picks up files that have already been rotated.
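
For context, each file holds one JSON event per line (newline-delimited JSON). A hypothetical sample of what a file under /root/cdrs/ looks like, assuming the g2uEvent field carries the ISO8601 timestamp that the Logstash grok filter below extracts (all other field names and values here are made up for illustration):

{"g2uEvent": "2017-11-20T10:15:30.123Z", "msisdn": "9199XXXXX01", "status": "DELIVERED"}
{"g2uEvent": "2017-11-20T10:15:31.456Z", "msisdn": "9199XXXXX02", "status": "FAILED"}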

Filebeat config (filebeat.yml):

filebeat.prospectors:
- input_type: log
  paths:
    - /root/cdrs/*-*.log
  json.keys_under_root: true
  json.add_error_key: true

  ignore_older: 24h
  close_inactive: 12h
  scan_frequency: 30s
  clean_inactive: 48h
  clean_removed: true
  close_removed: true
  close_eof: true

output.logstash:
  hosts: ["VALID IP:5043"]
  fields_under_root: false
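
With json.keys_under_root: true, the decoded JSON keys are placed at the top level of the event Filebeat ships (alongside Filebeat's own metadata), and with json.add_error_key: true a line that fails JSON decoding still produces an event, just with an error key (json_error on the 5.x prospector syntax used here, if I remember right) instead of the parsed fields. A rough sketch of a successfully parsed event as it leaves Filebeat, with metadata abbreviated and values hypothetical:

{
  "@timestamp": "2017-11-20T10:15:32.000Z",
  "source": "/root/cdrs/node1-20171120.log",
  "offset": 1024,
  "g2uEvent": "2017-11-20T10:15:30.123Z",
  "status": "DELIVERED"
}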

////////////////////////////////////////////////////////////////////////
Logstash Config:

input {
  beats {
    port => 5043
    client_inactivity_timeout => 86400
  }
}

filter {
  grok {
    match => { "g2uEvent" => "%{TIMESTAMP_ISO8601:g2uEventTime}" }
  }

  date {
    match => ["g2uEventTime", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  amazon_es {
    hosts => ["VALID ELASTIC SEARCH END POINT"]
    index => "d2c-%{+YYYY-MM-dd}"
  }

  stdout { codec => rubydebug }
}
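
To show what the filter section is expected to do, assuming a hypothetical event whose g2uEvent value starts with an ISO8601 timestamp: grok copies the matched timestamp into g2uEventTime, and the date filter parses that into @timestamp, so the daily d2c-YYYY-MM-dd index is chosen by event time rather than ingest time. Roughly, in rubydebug form:

{
        "g2uEvent" => "2017-11-20T10:15:30.123Z",
    "g2uEventTime" => "2017-11-20T10:15:30.123Z",
      "@timestamp" => 2017-11-20T10:15:30.123Z
}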