I am importing around 500 CSVs into a "test" bucket every hour, but when I verify the data/hits in Kibana, half of the logs are missing. When I checked the CSVs in S3, all of them exist, and the Logstash pipeline runs without any errors. I also reviewed `sincedb_path => 'C:/Test/logstash/test.sincedb'`, and it does save the time of the last processed CSV.
Logstash imports CSV files 1 to 90, then there is a gap: files 91 to 100 are skipped.
Please find the Logstash input code and Kibana visualizations below.

Logstash input section:
```
input {
  s3 {
    access_key_id => "XXXXXXXXXXXXXXXXXX"
    secret_access_key => "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
    region => "us-east-1"
    bucket => "test"
    prefix => "Logstashfolder/test/"
    delete => false
    interval => 10 # seconds
    sincedb_path => 'C:/Test/logstash/test.sincedb'
  }
}
```
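To show what I suspect is happening, here is a small sketch of the skip logic. It assumes (I am not certain of this) that the S3 input plugin's sincedb stores the `last_modified` time of the last processed object, and that objects whose timestamp is at or before that value are never picked up on later polls. The timestamps and object keys below are made up for illustration:

```python
from datetime import datetime, timezone

# Assumed sincedb content: the last_modified time of the last object
# Logstash processed (hypothetical value).
sincedb_time = datetime(2023, 1, 1, 12, 0, 0, tzinfo=timezone.utc)

# Sample object timestamps, e.g. as reported by `aws s3api list-objects-v2`
# (hypothetical keys and times).
objects = {
    "Logstashfolder/test/T1_090.csv": datetime(2023, 1, 1, 11, 30, tzinfo=timezone.utc),
    "Logstashfolder/test/T1_091.csv": datetime(2023, 1, 1, 12, 30, tzinfo=timezone.utc),
}

# Objects with last_modified at or before the sincedb time would be
# treated as already processed and skipped on the next poll.
skipped = [key for key, ts in objects.items() if ts <= sincedb_time]
print(skipped)  # → ['Logstashfolder/test/T1_090.csv']
```

If that assumption holds, any CSV uploaded with a `last_modified` older than the stored timestamp (for example, a delayed batch upload) would be silently skipped, which matches the gap I am seeing.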
Kibana view for the T1 file (last 24 hours):
Note: the CSVs that exist in S3 between 5 AM and 12 PM are the ones Logstash failed to import into ES. If I manually re-upload those files/CSVs to the S3 bucket, Logstash does import them.
Kibana view for the T2 file (last 24 hours):
Can someone please let me know what the actual issue is and how to resolve it? Or is this a Logstash performance issue?