As you can see, the output is also sent to stdout, and there I see thousands of entries flashing by, but when I look in Kibana I only see around 164 entries (with the time range set to the past 5 years).
The Elasticsearch logs show no errors, and the Logstash logs show no errors. I would expect to see thousands of entries from all the separate lines in the log files. What am I missing?
My guess would be that your document ID is not unique, or that the field you derive it from does not exist, causing lots of documents to get the same ID and be updated instead of inserted.
If this is the problem, you should see updates taking place if you look at the cat indices API (the document count stays low while `docs.deleted` grows). If this is the issue, send events to stdout or a file and check the fields present to find the error.
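To inspect the fields, a temporary debug output can be added to the pipeline. This is a sketch, not your actual config; the file path is hypothetical:

```
output {
  # Print every event with all its fields to stdout for inspection.
  stdout { codec => rubydebug }

  # Or write events to a file to review later (hypothetical path):
  # file { path => "/tmp/logstash-debug.log" }
}
```

With `rubydebug` you can verify that the field used as the document ID actually exists on each event and carries a unique value.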
Yes, that indicates updates or deletes are taking place. To fix this you need to look at your data and fix the document ID field. Alternatively, don't specify an ID at all and let Elasticsearch assign one.
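A minimal sketch of the second option, assuming a standard `elasticsearch` output (host and index name here are placeholders):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mylogs-%{+YYYY.MM.dd}"
    # No document_id option: Elasticsearch auto-generates a unique _id
    # per event, so every log line becomes its own document. Setting
    # document_id to a non-unique field would make later events
    # overwrite earlier ones instead of being inserted.
  }
}
```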