I have a set of 100 JSON messages that I send through the Logstash json filter.
Every time, a variable number of messages is persisted to the output file, anywhere between 60 and 100.
Things I've tried so far:
- Upgraded to 5.5 and enabled the dead_letter_queue (no messages are written to this queue; the folder is empty). Nothing shows up there even when I deliberately send an invalid JSON message!
- Applied a tag on failure called "failed" and tried to write those messages to a separate file. That file remains empty too.
- Looked for messages with the tag _jsonparsefailure. No messages show up with that either.
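For reference, a minimal sketch of the failure-tagging setup described above (field and path names here are illustrative, not my real config; the json filter is assumed to parse the `message` field):

```
filter {
  json {
    source         => "message"
    tag_on_failure => ["failed"]   # overrides the default "_jsonparsefailure" tag
  }
}

output {
  if "failed" in [tags] {
    file { path => "/tmp/failed.log" }   # stays empty in my runs
  } else {
    file { path => "/tmp/output.log" }
  }
}
```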
On 10 consecutive runs of the exact same file of 100 messages, the output file contained the following message counts:
69, 93, 97, 99, 100, 87, 64, 100, 92, 91
Versions of Logstash I've tried so far: 5.4.1 and 5.5.2
The only complexity beyond a regular use case is that one of the JSON fields is itself a JSON object, escaped as a string. I can provide a sample via email; I cannot post it here.
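To show the shape I mean (field names here are made up, not from my real data): the outer document is valid JSON, and one field holds a second JSON document as an escaped string, which needs a second parse:

```python
import json

# Outer message is valid JSON; "payload" holds an escaped JSON object as a string.
raw = '{"event": "login", "payload": "{\\"user\\": \\"alice\\", \\"ok\\": true}"}'

outer = json.loads(raw)                 # first pass: the outer document
inner = json.loads(outer["payload"])    # second pass on the escaped field,
                                        # like a second json filter targeting it
print(outer["event"], inner["user"])    # -> login alice
```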
Please help me debug this, since both adding tags and writing to the DLQ have failed.
I also tried a simpler data set (200 messages, with no nested complications) that I repeatedly run through the json filter.
At the end of 20 consecutive runs, I expect 4000 messages in the local output.log that the file output writes to, but I consistently see a lower count.
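One check I run on the result, to rule out duplicates masking drops: compare the total line count against the number of distinct message IDs. A minimal sketch, assuming each event carries a unique `id` field and the file output writes one JSON document per line (both are assumptions about the data, shown here with an inline sample standing in for output.log):

```python
import json

# Inline sample standing in for output.log: one JSON document per line.
lines = [
    '{"id": 1, "msg": "a"}',
    '{"id": 2, "msg": "b"}',
    '{"id": 2, "msg": "b"}',  # a duplicate would hide a drop at equal line counts
]

total = len(lines)
distinct = len({json.loads(line)["id"] for line in lines})
print(total, distinct)  # equal totals but fewer distinct ids would mean duplication
```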
There are no error messages in the Logstash log, and nothing shows up under --debug either, but the behavior is reproducible.
Is there an older version of Logstash that does not have the issue reported above? Any workaround or triage on this problem would really help. We are blocked in QA and need to guarantee 100% throughput to meet our SLA.