Hey everyone, I have been trying to fix an issue with duplicate entries being indexed into Elasticsearch through Logstash.
Below is an example of two duplicate entries.
Entry 1:
{
  "_index": "test-index",
  "_type": "doc",
  "_id": "3044350386",
  "_version": 2,
  "_score": 1,
  "_source": {
    "message": "same message",
    "log_type": "registration log",
    "@version": "1",
    "path": "eg.log",
    "@timestamp": "2023-03-15T07:43:17.019Z",
    "host": "same host"
  }
}
Entry 2:
{
  "_index": "test-index",
  "_type": "doc",
  "_id": "nEg55IYBsSh0OlkJJ8nh",
  "_version": 1,
  "_score": 1,
  "_source": {
    "@version": "1",
    "@timestamp": "2023-03-15T07:43:16.335Z",
    "path": "eg.log",
    "host": "same host",
    "message": "same message"
  }
}
I have also tried using the fingerprint filter, but I still end up with two entries: one with the ID from the fingerprint and one with an autogenerated ID.
Any help would be appreciated.
Please share your Logstash configuration.
Thanks for the reply. The problem was related to multiple Logstash processes running at the same time; after I killed the extra processes, the duplication stopped.
Duplicate data can be handled with the fingerprint filter. Check the blog.
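In case it is useful, here is a minimal sketch of that approach. The hosts value and the choice of "message" as the fingerprint source are assumptions; the index name is taken from the entries above. Adjust both to your setup.

filter {
  fingerprint {
    # Hash the field(s) that define uniqueness; "message" alone is an
    # assumption here, add host/path etc. if they matter for identity
    source => ["message"]
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    # hosts is a placeholder; index is taken from the sample entries
    hosts => ["localhost:9200"]
    index => "test-index"
    # Use the fingerprint as the document id so a repeated event
    # overwrites the existing document instead of creating a new one
    document_id => "%{[@metadata][fingerprint]}"
  }
}

Note that this only helps if every pipeline and output writing to the index sets document_id; any output without it falls back to autogenerated IDs, which would explain seeing one fingerprint ID plus one autogenerated ID.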
However, as Leandro said, share your .conf and a data sample in case you still have problems.