Logstash not sending to ES when date parsing fails

Hi

I have an epoch timestamp which I want to get into the @timestamp field in Elasticsearch/Kibana. When I use the code:

date {
  match => [ "slowlog_timestamp", "UNIX" ]
  remove_field => [ "slowlog_timestamp" ]
}

I get a _dateparsefailure and nothing is sent to Elasticsearch/Kibana. When I use the code:

date {
  match => [ "slowlog_timestamp", "UNIX" ]
  remove_field => [ "slowlog_timestamp" ]
  target => "myTime"
}

I get no _dateparsefailure and the event ends up in Elasticsearch/Kibana (see screenshot).

Does anybody have any ideas?

Thanks in advance.

Regards
Davy

What does the raw message that results in _dateparsefailure look like? I'm interested in what the slowlog_timestamp field looks like.
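A quick way to see it is to route the failing events to stdout with the rubydebug codec; something like this (just a sketch, assuming the default _dateparsefailure tag):

output {
  if "_dateparsefailure" in [tags] {
    # dump the full event, including slowlog_timestamp, for inspection
    stdout { codec => rubydebug }
  }
}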

Hi Magnus

Thanks for your response. After a lot of debugging I finally found the root cause of my problem. Because I was replaying (old) log data from a production server over and over again, and because I was modifying the timestamp field, Elasticsearch put the shipped data in another (older) index than the (current) one I was looking at (really pebkac, isn't it).
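For anyone else who runs into this: with time-based indices the index name is built from @timestamp, so events that get an old timestamp from the date filter are written to the daily index for that old day, not today's. Roughly (the index pattern below is the elasticsearch output's default, shown explicitly just for illustration):

output {
  elasticsearch {
    # the date in the index name comes from @timestamp, not the wall clock,
    # so re-shipped events with an old timestamp land in an old daily index
    index => "logstash-%{+YYYY.MM.dd}"
  }
}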

The _dateparsefailure mentioned was also caused by me: I was trying incorrect conversions in an effort to debug the problem while looking in the wrong direction.

Thanks again and sorry for (wasting) your time.

Regards