I have some test logs in JSON format; the file I'm testing has 900+ entries. After a fresh installation, Filebeat starts sending the logs, but after some time (a few minutes) it starts giving this error:
ERR Failed to publish events: temporary bulk send failure
Enabling debug logs shows this:
DBG Bulk item insert failed (i=13, status=500): {"type":"string_index_out_of_bounds_exception","reason":"String index out of range: 0"}
The final count of documents in the index created in Elasticsearch is 631, fewer than the 900+ in the log file (Untitled-2.json in the attached logs). It has sometimes stopped as early as the 100s.
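A quick way to compare the two counts is below (a sketch: a one-JSON-object-per-line file and the default `filebeat-*` index pattern are assumptions; adjust to your setup):

```sh
# Events in the source file (assumes one JSON object per line)
wc -l Untitled-2.json

# Documents that actually made it into Elasticsearch;
# filebeat-* is the default index pattern and may differ in your setup
curl -s 'localhost:9200/filebeat-*/_count?pretty'
```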
The debug logs and the Filebeat configuration are attached in this gist:
Thanks for reporting. I wonder if the error happens when processing events from the syslog module or from the json prospector. Either way, it might be a bug.
Can you test with only the syslog module enabled, and then with only the json prospector enabled? If one or the other fails, can you try reducing the log files until we find the few events causing the issue? A sketch of the json-only run is below.
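Something like this would run only the json prospector, with no modules loaded (a sketch using Filebeat 5.x syntax; the file path, json options, and output settings are assumptions, so substitute your own):

```sh
# Minimal config for the isolation test: json prospector only, no syslog module
cat > filebeat-json-only.yml <<'EOF'
filebeat.prospectors:
  - input_type: log
    paths:
      - /path/to/Untitled-2.json   # assumed location of the JSON test file
    json.keys_under_root: true     # assumed decoding options; match your config
    json.add_error_key: true       # annotate events that fail to decode
output.elasticsearch:
  hosts: ["localhost:9200"]        # assumed output; use your own settings
EOF

# -e logs to stderr, -c points at the test config
filebeat -e -c filebeat-json-only.yml
```

Repeating the run with only the syslog module enabled (and the json prospector removed) would tell us which side produces the failing events.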
I updated the gist to include the Elasticsearch logs; I didn't see any ERR in there. I did check with the json prospector disabled, and there were no errors. I'll reduce the events and try again.
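In case it helps, bisecting the file converges quickly (a sketch; the line counts assume the 900+ entry file from above, and the registry path is the default for the Linux packages):

```sh
# Split the file in half; test each half, then halve again on the failing side
head -n 450 Untitled-2.json > first-half.json
tail -n +451 Untitled-2.json > second-half.json

# Between runs, clear Filebeat's registry so the file is read from the
# beginning again (default location for the deb/rpm packages; adjust if needed)
rm /var/lib/filebeat/registry
```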