Hello everyone. I have a problem with Logstash and Elasticsearch, which I would like to explain with an example. I have two conf files: example_A.conf and example_B.conf. I load data from example_A.conf with Logstash into Elasticsearch, and it loads successfully. Then I load example_B.conf, and all the data from example_A is deleted. The total document count of the index does not change either, yet there is no error while loading. The shard number of the index is 0, and its size is 537.4gb. If anyone has an idea what the problem could be and shares it with me, I would be very happy.
You will need to share both the A and B conf files, and perhaps some sample data.
If I remember correctly, you were working with the fingerprint filter before. It sounds to me like you may be overwriting the data via the document ID, but without seeing any actual configuration or data, that's just a guess.
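For context, here is a hedged sketch of how that overwrite can happen. The field names, index name, and filter settings below are assumptions for illustration, not the poster's actual configs: if both pipelines compute a fingerprint over the same source field(s) and use it as the Elasticsearch document ID, the second load produces the same IDs and replaces the existing documents instead of adding new ones, so the count stays flat and no error is raised.

```conf
# Hypothetical pipeline sketch — field names and index are assumptions.
filter {
  fingerprint {
    # If example_A.conf and example_B.conf both hash the same field(s),
    # overlapping values yield identical fingerprints in both runs.
    source => ["message"]
    method => "SHA256"
    target => "[@metadata][fingerprint]"
  }
}
output {
  elasticsearch {
    index => "my-index"
    # Using the fingerprint as document_id makes indexing idempotent:
    # a later load with the same IDs overwrites existing documents
    # rather than appending, which matches the symptoms described.
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

If deduplication within one pipeline is wanted but the two pipelines must not collide, one option is to include a pipeline-specific field or tag in the fingerprint's `source` list so the two loads produce distinct IDs.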
Show us your configurations and the data, and perhaps we can help.
Hello @stephenb, the problem was in the fingerprint filter, as you said. I have solved it. There are actually over 1.5 billion documents in the index, and the index has only one shard, so I thought the problem was with the shards or with memory; I didn't think it would overwrite. Thank you so much for reminding me of this. You have helped me a lot to improve my Elasticsearch and Logstash skills. Thank you for everything and all your help.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.