Logstash doesn't recognize "Feb 29" as a valid date

Logstash started continuously outputting the following error when the clock hit February 29th, 2016:

{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_field_value_exception", "reason"=>"Cannot parse "Feb 29 00:17:01": Value 29 for dayOfMonth must be in the range [1,28]"}}}}, :level=>:warn}

This is causing my logstash/elasticsearch host to run out of space as the log files grow out of control.

Is there a workaround for this? Can I configure logstash and elasticsearch to ignore this error message?

Best Regards,

Does your message have a year in the date?
If not, use something like this to add it in:

filter {
  mutate {
    replace => ["timestamp", "%{timestamp} 2016"]
  }
  date {
    locale => "en"
    match => ["timestamp", "MMM d HH:mm:ss YYYY", "MMM dd HH:mm:ss YYYY", "ISO8601"]
  }
}
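The underlying failure mode can be reproduced outside Logstash. When a timestamp has no year, the parser has to assume one, and the assumed default is not a leap year, so "Feb 29" is rejected; supplying an explicit year fixes it. This Python sketch uses `strptime` (which defaults the year to 1900, a non-leap year) as an analogy to the Joda-Time behavior in Logstash:

```python
from datetime import datetime

# Without a year, strptime assumes 1900 -- not a leap year --
# so "Feb 29" is rejected, analogous to the Logstash error.
try:
    datetime.strptime("Feb 29 00:17:01", "%b %d %H:%M:%S")
except ValueError as e:
    print("no year:", e)  # day is out of range for month

# Appending an explicit leap year lets the same timestamp parse.
dt = datetime.strptime("Feb 29 00:17:01 2016", "%b %d %H:%M:%S %Y")
print(dt)  # 2016-02-29 00:17:01
```

This is why the `mutate` filter above appends the year before the `date` filter runs.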

Thanks for your help @warkolm. After I updated my Logstash config file per your suggestion, I'm no longer seeing the 'Value 29...' error. However, I'm now seeing a ton of these errors in my Elasticsearch log:

Mar 1 02:44:13 logstash elasticsearch[20315]: MapperParsingException[failed to parse [timestamp]]; nested: IllegalArgumentException[Invalid format: "Feb 29 22:29:16 2016" is malformed at " 2016"];

I have updated my elasticsearch-template.json file so that the timestamp field now has the following entry:

    "timestamp" : {
        "type" : "date",
        "format" : "epoch_millis||dateOptionalTime||MMM dd HH:mm:ss||dd/MMM/YYYY:HH:mm:ss Z||yyyy/MM/dd HH:mm:ss||yyyy/MM/dd||MMM  d HH:mm:ss yyyy||MMM dd HH:mm:ss yyyy"
    }
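One way to sanity-check the new formats is to try the failing value against an equivalent strftime pattern. This Python sketch is only an analogy to the Joda patterns above (the `%b %d %H:%M:%S %Y` mapping for `MMM dd HH:mm:ss yyyy` is an assumption), not Elasticsearch itself:

```python
from datetime import datetime

# The value from the Elasticsearch error message.
value = "Feb 29 22:29:16 2016"

# Rough strftime equivalent of the Joda pattern "MMM dd HH:mm:ss yyyy".
dt = datetime.strptime(value, "%b %d %H:%M:%S %Y")
print(dt.isoformat())  # 2016-02-29T22:29:16
```

Since the pattern itself matches the value, the problem may be that the template was never applied: index templates generally only take effect for indices created after the template is updated, so an existing index can keep rejecting the field with its old mapping.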

but it doesn't seem to help.

Any idea how I can get rid of the error in Elasticsearch?