Logstash how to handle exceptions

I have a problem handling exceptions in a Logstash filter.
I have to parse dates that don't always arrive in the same format, and this causes the following error:

Ruby exception occurred: invalid strptime format - `%d/%m/%Y %H.%M.%S %Z' {:level=>:error}

The Logstash filter is:

if [DATA] {
  ruby {
    code => "
      event['DATA'] = Time.strptime(event['DATA'] + ' Europe/Rome', '%d/%m/%Y %H.%M.%S %Z').gmtime.strftime('%Y-%m-%dT%H:%M:%S.%LZ')
    "
  }
}

Not only does it raise an exception, but sometimes Logstash stops loading data into Elasticsearch altogether.

How can I handle the exception?

thank you
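One way to keep the pipeline alive is to rescue the ArgumentError that Time.strptime raises when the format doesn't match, and return a fallback (or tag the event) instead of crashing. A minimal plain-Ruby sketch of the idea, assuming the timestamps are effectively UTC (the safe_parse name is just for illustration):

```ruby
require 'time'

# Parse a raw date string; return the ISO 8601 string on success,
# or nil when the format does not match, instead of raising.
def safe_parse(raw)
  Time.strptime(raw + ' UTC', '%d/%m/%Y %H.%M.%S %Z')
      .gmtime.strftime('%Y-%m-%dT%H:%M:%S.%LZ')
rescue ArgumentError
  nil # in a ruby filter you could tag or drop the event here instead
end

puts safe_parse('18/12/2015 17.53.05')          # → 2015-12-18T17:53:05.000Z
puts safe_parse('18/12/2015 17:53:05').inspect  # wrong separator → nil
```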

Would you not be able to use the date filter with an array of possible date formats?

The format can be different each time, and it would be difficult to anticipate all of them.

Hi,
thanks for the advice. Could you please show me, for instance, how to apply the date filter supposing the format could be
'%d/%m/%Y %H.%M.%S %Z'
'%d/%m/%Y %H:%M:%S %Z'

thank you

If you have multiple possible separators, you can try to normalize them using a mutate gsub filter. This will reduce the number of patterns you need to match. Can you provide actual examples of the date patterns you want to match?
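For example, a mutate filter along these lines would turn the colons into dots before any date parsing (the field name DATA is assumed):

```
mutate {
  gsub => ["DATA", ":", "."]
}
```

After that, a single dotted pattern is enough to cover both variants.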

The dates are, for example:

18/12/2015 17.53.05 or 18/12/2015 17:53:05

and the expected result is:

2015-12-18T17:53:05.000Z

What have you tried so far?

The gsub solution doesn't seem to be working. I've tried to use the date filter, but it doesn't work either. Maybe I am doing something wrong.

Per the docs you should be able to use something like:

match => [ "datefield", "d/m/Y H.M.S Z", "d/m/Y H.M:S Z" ]

Sorry, but it doesn't work either.

This is my CSV input:
TTM000005718043 ; 11/12/2015 09.42.12;
TTM000006099454 ; 11/12/2015 12:06:08;
TTM000006097855 ; 11/12/2015 13.22.56;
TTM000001111111 ; 11/12/2015 13:24:14;

This is the filter section of my Logstash config file:

filter {

  if [type] == "test" {
    csv {
      columns => ["ID","DATE"]
      separator => ";"
    }
    date {
      match => ["DATE", "d/m/Y H.M.S Z", "d/m/Y H.M:S Z", "ISO8601"]
    }
  }
}


Where am I going wrong?


match => ["DATE", "dd/MM/YYYY HH.mm.ss Z", "dd/MM/YYYY HH:mm:ss Z", "ISO8601"]

The structure of the match is otherwise the same.


Since you don't have a timezone at the end, remove the " Z" too. Then it works with the example you provided:

$ cat test.config
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  date {
    match => ["message", "dd/MM/YYYY HH.mm.ss"]
  }
}
$ echo '11/12/2015 09.42.12' | /opt/logstash/bin/logstash -f test.config
Logstash startup completed
{
       "message" => "11/12/2015 09.42.12",
      "@version" => "1",
    "@timestamp" => "2015-12-11T08:42:12.000Z",
          "host" => "lnxolofon"
}
Logstash shutdown completed

But 11/12/2015 09.42.12 must become 2015-12-11T09:42:12.000Z.
I tried removing the time zone, but it doesn't change the situation.

If your timestamps are already in UTC and you don't want the date filter to interpret them as local time, use the filter's timezone option:

date {
  timezone => "UTC"
}
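For completeness, a sketch of the whole date block with that option (the field name DATE is assumed; target is optional, and without it the parsed value goes to @timestamp):

```
date {
  match    => ["DATE", "dd/MM/yyyy HH.mm.ss", "dd/MM/yyyy HH:mm:ss"]
  timezone => "UTC"
  target   => "DATE"
}
```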

Nothing changes....

Sorry, but I do not understand. Let me try to explain again...

These are the input records of my CSV:

TTM000005718043;2015/11/12 09.42.12;
TTM000006099454;2015/11/12 00:06:08;
TTM000006097855;2015/11/12 13.22.56;
TTM000001111111;2015/11/12 13:24:14;

This is my Logstash config file:

input {
  file {
    path => "input/test_*.csv"
    start_position => "beginning"
    type => "test"
    sincedb_path => "work/.sincedb_test"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

filter {
  if [type] == "test" {
    csv {
      columns => ["ID","DATE"]
      separator => ";"
    }
    date {
      match => ["DATE", "dd/MM/YYYY HH.mm.ss", "dd/MM/YYYY HH:mm:ss", "ISO8601"]
      add_field => [ "key", "%{ID}" ]
    }
  }
}

output {
  elasticsearch {
    host => "my_host"
    cluster => "my_cluster"
    index => "my_index"
    document_type => "%{type}"
    document_id => "%{key}"
    template => "config/template.json"
  }
}

This is the output I want on ES:
TTM000005718043; 2015-12-11T09:42:12.000Z
TTM000006099454; 2015-12-11T00:06:08.000Z
TTM000006097855; 2015-12-11T13:22:56.000Z
TTM000001111111; 2015-12-11T13:24:14.000Z
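For reference, the conversion those rows need can be reproduced in plain Ruby by trying each candidate pattern in turn, which is essentially what the date filter's match array does (UTC is assumed, and to_iso8601 is just an illustrative name):

```ruby
require 'time'

FORMATS = ['%d/%m/%Y %H.%M.%S', '%d/%m/%Y %H:%M:%S'].freeze

# Try each known pattern; return the ISO 8601 string for the first
# one that matches, or nil if none of them do.
def to_iso8601(raw)
  FORMATS.each do |fmt|
    begin
      return Time.strptime(raw + ' UTC', fmt + ' %Z')
                 .gmtime.strftime('%Y-%m-%dT%H:%M:%S.%LZ')
    rescue ArgumentError
      next # this pattern did not match, try the next one
    end
  end
  nil
end

puts to_iso8601('11/12/2015 09.42.12')  # → 2015-12-11T09:42:12.000Z
puts to_iso8601('11/12/2015 13:24:14')  # → 2015-12-11T13:24:14.000Z
```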

Where is the error?

We have your configuration and the expected result, but what is the current result from your configuration?

You seem to have a space at the start of the date column, which your date pattern does not account for. You can remove it with a mutate strip filter. I would also recommend using the stdout output with the rubydebug codec while troubleshooting this issue.
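Concretely, those two suggestions could look like this (the field name DATE is assumed; strip removes leading and trailing whitespace):

```
filter {
  mutate {
    strip => ["DATE"]
  }
}
output {
  stdout { codec => rubydebug }
}
```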

Yes, sorry...
This is the current result:
"ID" => "TTM000001111111",
"DATE" => "11/12/2015 13:24:14"

The space is not actually there; I made a mistake when pasting the content. And I already use rubydebug.