Logstash began continuously outputting the following error when the clock hit February 29th, 2016:
{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_field_value_exception", "reason"=>"Cannot parse "Feb 29 00:17:01": Value 29 for dayOfMonth must be in the range [1,28]"}}}}, :level=>:warn}
This is causing my Logstash/Elasticsearch host to run out of disk space as the log files grow out of control.
Is there a workaround for this? Can I configure Logstash and Elasticsearch to ignore this error message?
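(Judging from the follow-up below, the workaround is to rewrite the timestamp in the Logstash filter so it carries an explicit year: a syslog timestamp has no year, so the date parser falls back to 1970, which is not a leap year, and Feb 29 is rejected. A minimal sketch of that idea, assuming the field is named "timestamp"; the exact filter warkolm suggested may differ:)

filter {
  mutate {
    # Append an explicit year so Feb 29 is parsed against a real
    # leap year instead of the 1970 default.
    replace => { "timestamp" => "%{timestamp} 2016" }
  }
  date {
    # "MMM  d" (two spaces) matches syslog's space-padded single-digit days.
    match => [ "timestamp", "MMM dd HH:mm:ss yyyy", "MMM  d HH:mm:ss yyyy" ]
  }
}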
Thanks for your help @warkolm. After I updated my Logstash config file per your suggestion, I'm no longer seeing the 'Value 29...' error. However, I'm now seeing a ton of these errors in my Elasticsearch log:
Mar 1 02:44:13 logstash elasticsearch[20315]: MapperParsingException[failed to parse [timestamp]]; nested: IllegalArgumentException[Invalid format: "Feb 29 22:29:16 2016" is malformed at " 2016"];
I have updated my elasticsearch-template.json file so that the timestamp field now has the following entry:
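(The original entry isn't shown in this thread; a plausible sketch, assuming the mapping's Joda date formats were extended to accept the appended year, with || separating alternative patterns:)

"timestamp": {
  "type": "date",
  "format": "MMM dd HH:mm:ss yyyy||MMM  d HH:mm:ss yyyy"
}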