Data lost while importing data into Elasticsearch using Logstash

I am using Logstash 5.5.0 and Elasticsearch 5.5.0.
I am trying to import data from a txt file with a | separator into Elasticsearch.
The txt file contains approximately 2,139,406 records, but only about 1,795,821 of them are imported into Elasticsearch; the remaining records are lost.

Then I ran the file through the same Logstash config, but with a txt file output rather than Elasticsearch, and found that no records were missed/lost.

e.g.

output {
  if "insertupdate" in [tags] {
    # write events that didn't match to a file
    file { path => "*********/grok_failures.txt" }
  }

  stdout { codec => rubydebug { metadata => true } }
}
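To rule out parse failures, I am also considering capturing events tagged _grokparsefailure (the default tag grok adds when a line fails to parse) in their own file so they can be counted; a minimal sketch, with a hypothetical output path:

output {
  if "_grokparsefailure" in [tags] {
    # collect rows the grok filter could not parse, so they can be counted
    file { path => "/tmp/unparsed_rows.txt" }
  }
}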

The original output section that sends the data to Elasticsearch is:
output {
  if "insertupdate" in [tags] {
    elasticsearch {
      hosts => ["***********"]
      index => "test143"
      action => "update"
      doc_as_upsert => true
      document_id => "%{entityId}"
      document_type => "client"
    }
  }
}
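Could duplicate IDs explain the difference? With document_id => "%{entityId}", action => "update", and doc_as_upsert => true, any rows that share the same entityId would update one existing document instead of creating new documents, so the index would hold fewer documents than the file has lines without any data actually being dropped. A minimal sketch I could use to test this, letting Elasticsearch generate the IDs and writing to a hypothetical scratch index:

output {
  if "insertupdate" in [tags] {
    elasticsearch {
      hosts => ["***********"]
      # hypothetical scratch index, used only to compare document counts
      index => "test143_id_check"
      # no action/document_id/doc_as_upsert: every event becomes a new document
    }
  }
}

If test143_id_check ends up with roughly 2,139,406 documents, the shortfall in test143 would come from duplicate entityId values being merged by the upsert rather than from lost records.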

Please help me resolve this, and tell me what configuration I need for Logstash and Elasticsearch.
