I was sending gz logs to Elasticsearch via Logstash. Elasticsearch's disk filled up, but I left it that way, so Logstash reported many 403 errors.
Some days later, Logstash crashed, and some of my gz logs disappeared. The crash did not generate any hprof files or other dumps.
I configured the gz log path in Logstash to point at soft links to the logs' parent folders.
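For context, the input section looked roughly like the sketch below; the path is illustrative, not my exact value. As far as I understand, reading whole .gz files requires the file input's read mode, and in read mode the plugin's file_completed_action option defaults to delete:

    input {
      file {
        path => "/data/logs/*/*.gz"   # actually a soft link to the parent folder (illustrative path)
        mode => "read"                # read mode is needed to consume complete .gz files
        # file_completed_action defaults to "delete" in read mode,
        # which removes the source file after it is fully processed
      }
    }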
In my opinion, no matter how Logstash crashed, it should not have deleted my raw logs.
The last line in the Logstash log:
[logstash.outputs.elasticsearch] Retrying individual bulk actions that failed or were rejected by the previous bulk request. {:count=>1000}