Logstash has been continuously outputting the following error since the clock hit February 29th, 2016:
{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_field_value_exception", "reason"=>"Cannot parse "Feb 29 00:17:01": Value 29 for dayOfMonth must be in the range [1,28]"}}}}, :level=>:warn}
This is causing my Logstash/Elasticsearch host to run out of disk space as the log files grow out of control.
Is there a workaround for this? Can I configure Logstash and Elasticsearch to ignore this error message?
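If it helps, would something along these lines be the right direction, letting the date filter build @timestamp and dropping the raw field so Elasticsearch never has to map the yearless string? This is only an untested sketch based on my assumed field name above:

filter {
  date {
    # syslog timestamps have no year; cover single- and double-digit day forms
    match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    target => "@timestamp"
    # drop the raw field so the index never sees the yearless date string
    remove_field => [ "timestamp" ]
  }
}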
Best Regards,