Invalid timestamp format, but the date is formed correctly

My Logstash config (excerpt):

...
date {
	locale => "en"
	match => ["date", "dd.MM.YYYY;HH:mm:ss"]
	timezone => "Europe/Ljubljana"
	target => "date"
}
...

I've processed 352,327,606 records with this. 352,327,595 were OK, but 11 threw an error:

[2017-12-02T21:38:52,855][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x61d3361>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"IIfzGGABRrLTAyiQ8Z6L", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:42:00\" is malformed at \".03.2009;02:42:00\""}}}}}
[2017-12-02T21:38:52,920][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x5ae31e28>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"IYfzGGABRrLTAyiQ8Z6L", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:42:01\" is malformed at \".03.2009;02:42:01\""}}}}}
[2017-12-02T21:38:52,935][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x657bbaba>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"IofzGGABRrLTAyiQ8Z6L", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:42:02\" is malformed at \".03.2009;02:42:02\""}}}}}
[2017-12-02T22:50:12,228][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x449a80fc>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"qeI1GWABRrLTAyiQTRT-", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:09:00\" is malformed at \".03.2009;02:09:00\""}}}}}
[2017-12-03T03:26:53,775][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x1b1b2b88>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"bCEyGmABRrLTAyiQln2z", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:12:00\" is malformed at \".03.2009;02:12:00\""}}}}}
[2017-12-03T07:35:51,399][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x43a31561>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"8jYWG2ABRrLTAyiQhP4I", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:08:00\" is malformed at \".03.2009;02:08:00\""}}}}}
[2017-12-03T08:30:55,304][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x5b75f6a>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"DYpIG2ABRrLTAyiQ9m2h", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:03:00\" is malformed at \".03.2009;02:03:00\""}}}}}
[2017-12-03T08:30:55,304][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x234fc3e0>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"DopIG2ABRrLTAyiQ9m2h", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:06:00\" is malformed at \".03.2009;02:06:00\""}}}}}
[2017-12-03T08:30:55,305][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x7f1a60b9>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"D4pIG2ABRrLTAyiQ9m2h", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:07:00\" is malformed at \".03.2009;02:07:00\""}}}}}
[2017-12-03T09:36:50,047][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x3adde0ad>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"dvaFG2ABRrLTAyiQT_kK", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"31.03.2013;02:06:00\" is malformed at \".03.2013;02:06:00\""}}}}}
[2017-12-03T15:03:57,910][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"transakcije", :_type=>"log_transakcije", :_routing=>nil}, #<LogStash::Event:0x6c159c4e>], :response=>{"index"=>{"_index"=>"transakcije", "_type"=>"log_transakcije", "_id"=>"LcywHGABRrLTAyiQzki8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29.03.2009;02:02:00\" is malformed at \".03.2009;02:02:00\""}}}}}

Could someone advise me how to fix this?

The date filter clearly didn't succeed for these events. I'm not sure why; your configuration looks correct.

Thanks @magnusbaeck.

I had figured that much out myself, but I cannot figure out why it fails.
I'm using LS 6.0.0.

In versions prior to 6.0.0 the input message was also included in the error log line. Do you know how to turn that back on?

Here's my message:

|cir=C2|date=29.03.2009 02:42:01|acronym=CEKLJ|libraryCode=50005|user=EZPROXY|type=57|transactionHostDepartment=01|membIdentificNumb=0051022|patronId=0051022|patronCategory=020|lastVisitDate=28.03.2009|schoolName=19|schoolDept=1|libraryDept=00|firstsignUpDate=09.11.2004

I'm using this mutate filter to alter the date beforehand; the gsub turns "29.03.2009 02:42:01" into "29.03.2009;02:42:01" so it matches the dd.MM.YYYY;HH:mm:ss pattern:

mutate {
	gsub => ["date", " ", ";"]
	convert => {"type" => "integer"}
	convert => {"rptPackageStatus" => "integer"}
	add_field => {"country" => "si"}
}
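(As an aside, the gsub could presumably be avoided by matching the space directly in the date filter; an untested sketch:)

date {
	locale => "en"
	match => ["date", "dd.MM.YYYY HH:mm:ss"]
	timezone => "Europe/Ljubljana"
	target => "date"
}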

I tried it again, after grepping through the log files to find this entry.
The error persists.

I'm curious why it only fails in March, and only on days greater than the 28th. Could it be thinking it's still in February?

What does the rest of your config look like? Have you used grok and/or a kv filter to parse out the date string into a date field? What does the event look like if you output it using a stdout output plugin with a rubydebug codec?
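For example, a minimal sketch of such an output section, which pretty-prints every event to the console:

output {
	stdout { codec => rubydebug }
}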

Here's my full config:

# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
	beats {
		port => "5044"
	}
}
# The filter part of this file is commented out to indicate that it is
# optional.
filter {
	mutate {
		gsub => ["message", "\|C3\|", "|cir=C3|"]
	}

#	kv {
#		field_split => "|"
#		include_brackets => false
#	}
	ruby {
		code => "
			# Split the message on '|' and keep only the key=value segments.
			a = event.get('message').split('|').delete_if{|x| !x.match(/=/)}
			# Set each key=value pair as an event field.
			a.each {|y| b = y.split('=', 2)
				event.set(b[0].strip, b[1])
			}
			# Normalize the library acronym to upper case.
			event.set('acronym', event.get('acronym').upcase)"
	}
	mutate {
		gsub => ["date", " ", ";"]
		convert => {"type" => "integer"}
		convert => {"rptPackageStatus" => "integer"}
		add_field => {"country" => "si"}
	}
	date {
		locale => "en"
		match => ["date", "dd.MM.YYYY;HH:mm:ss"]
		timezone => "Europe/Ljubljana"
		target => "date"
	}
	date {
		locale => "en"
		match => ["returnDate", "dd.MM.YYYY"]
		timezone => "Europe/Ljubljana"
		target => "returnDate"
	}
	date {
		locale => "en"
		match => ["firstsignUpDate", "dd.MM.YYYY"]
		timezone => "Europe/Ljubljana"
		target => "firstsignUpDate"
	}
	date {
		locale => "en"
		match => ["lastVisitDate", "dd.MM.YYYY"]
		timezone => "Europe/Ljubljana"
		target => "lastVisitDate"
	}
	date {
		locale => "en"
		match => ["loanDate", "dd.MM.YYYY"]
		timezone => "Europe/Ljubljana"
		target => "loanDate"
	}
	date {
		locale => "en"
		match => ["lastProlongDate", "dd.MM.YYYY"]
		timezone => "Europe/Ljubljana"
		target => "lastProlongDate"
	}
	date {
		locale => "en"
		match => ["reservationDate", "dd.MM.YYYY"]
		timezone => "Europe/Ljubljana"
		target => "reservationDate"
	}
}
output {
	elasticsearch {
		hosts => [ "localhost:9200" ]
		index => "transakcije"
		document_type => "log_transakcije"
	}
#	stdout { codec => rubydebug }
}
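(An aside, unrelated to the failure itself: to avoid losing the 11 bad records, events tagged _dateparsefailure could be routed to a file instead of Elasticsearch; a rough, untested sketch using the stock file output, with a made-up path:)

output {
	if "_dateparsefailure" in [tags] {
		# hypothetical dump file for events whose date could not be parsed
		file { path => "e:/data/dateparsefailure.log" }
	} else {
		elasticsearch {
			hosts => [ "localhost:9200" ]
			index => "transakcije"
			document_type => "log_transakcije"
		}
	}
}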

I'll now try rubydebug and let you know.

Here's my rubydebug output:

[2017-12-04T10:16:43,478][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
[2017-12-04T10:16:43,509][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2017-12-04T10:16:43,525][INFO ][org.logstash.beats.Server] Starting server on port: 5044
{
                         "date" => "29.03.2009;02:42:01",
                  "libraryDept" => "00",
                      "country" => "si",
                       "source" => "e:\\data\\try_again.txt",
                         "type" => 57,
                          "cir" => "C2",
               "patronCategory" => "020",
    "transactionHostDepartment" => "01",
                     "@version" => "1",
                         "beat" => {
            "name" => "elastic",
        "hostname" => "elastic",
         "version" => "6.0.0"
    },
                         "host" => "elastic",
                   "schoolName" => "19",
              "firstsignUpDate" => 2004-11-08T23:00:00.000Z,
                       "offset" => 272,
                      "acronym" => "CEKLJ",
                     "patronId" => "0051022",
                   "schoolDept" => "1",
                   "prospector" => {
        "type" => "log"
    },
                  "libraryCode" => "50005",
            "membIdentificNumb" => "0051022",
                      "message" => "|cir=C2|date=29.03.2009 02:42:01|acronym=CEKLJ|libraryCode=50005|user=EZPROXY|type=5
7|transactionHostDepartment=01|membIdentificNumb=0051022|patronId=0051022|patronCategory=020|lastVisitDate=28.03.2009|sc
hoolName=19|schoolDept=1|libraryDept=00|firstsignUpDate=09.11.2004",
                "lastVisitDate" => 2009-03-27T23:00:00.000Z,
                         "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_dateparsefailure"
    ],
                   "@timestamp" => 2017-12-04T09:17:25.178Z,
                         "user" => "EZPROXY"
}

When the date filter fails it should log a message that indicates why.

There was no message in the console.
As you can see, there was just the LS startup sequence followed by the stdout output.

Should I turn on something else?
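(For reference, the log level can be raised either by starting Logstash with the --log.level debug flag or by setting log.level: debug in logstash.yml; both are standard Logstash options, and that's what I tried next.)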

@magnusbaeck

Here's the output with the log level raised to debug:

[2017-12-04T13:16:47,181][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x7e46b673@E:/logstash/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-04T13:16:48,434][DEBUG][io.netty.util.internal.JavassistTypeParameterMatcherGenerator] Generated: io.netty.util.internal.__matchers__.org.logstash.beats.AckMatcher
[2017-12-04T13:16:48,438][DEBUG][io.netty.util.internal.JavassistTypeParameterMatcherGenerator] Generated: io.netty.util.internal.__matchers__.org.logstash.beats.BatchMatcher
[2017-12-04T13:16:48,481][DEBUG][org.logstash.beats.BeatsHandler] [local: 0:0:0:0:0:0:0:1:5044, remote: 0:0:0:0:0:0:0:1:57961] Received a new payload
[2017-12-04T13:16:48,482][DEBUG][org.logstash.beats.BeatsHandler] [local: 0:0:0:0:0:0:0:1:5044, remote: 0:0:0:0:0:0:0:1:57961] Sending a new message for the listener, sequence: 1
[2017-12-04T13:16:48,515][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2017-12-04T12:16:47.409Z, "offset"=>272, "@version"=>"1", "beat"=>{"name"=>"elastic", "hostname"=>"elastic", "version"=>"6.0.0"}, "host"=>"elastic", "prospector"=>{"type"=>"log"}, "source"=>"e:\\data\\try_again.txt", "message"=>"|cir=C2|date=29.03.2009 02:42:01|acronym=CEKLJ|libraryCode=50005|user=EZPROXY|type=57|transactionHostDepartment=01|membIdentificNumb=0051022|patronId=0051022|patronCategory=020|lastVisitDate=28.03.2009|schoolName=19|schoolDept=1|libraryDept=00|firstsignUpDate=09.11.2004", "tags"=>["beats_input_codec_plain_applied"]}}
[2017-12-04T13:16:48,528][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"country", "value"=>["si"]}
[2017-12-04T13:16:48,544][DEBUG][logstash.pipeline        ] output received {"event"=>{"date"=>"29.03.2009;02:42:01", "libraryDept"=>"00", "country"=>"si", "source"=>"e:\\data\\try_again.txt", "type"=>57, "cir"=>"C2", "patronCategory"=>"020", "transactionHostDepartment"=>"01", "@version"=>"1", "beat"=>{"name"=>"elastic", "hostname"=>"elastic", "version"=>"6.0.0"}, "host"=>"elastic", "schoolName"=>"19", "firstsignUpDate"=>2004-11-08T23:00:00.000Z, "offset"=>272, "acronym"=>"CEKLJ", "patronId"=>"0051022", "schoolDept"=>"1", "prospector"=>{"type"=>"log"}, "libraryCode"=>"50005", "membIdentificNumb"=>"0051022", "message"=>"|cir=C2|date=29.03.2009 02:42:01|acronym=CEKLJ|libraryCode=50005|user=EZPROXY|type=57|transactionHostDepartment=01|membIdentificNumb=0051022|patronId=0051022|patronCategory=020|lastVisitDate=28.03.2009|schoolName=19|schoolDept=1|libraryDept=00|firstsignUpDate=09.11.2004", "lastVisitDate"=>2009-03-27T23:00:00.000Z, "tags"=>["beats_input_codec_plain_applied", "_dateparsefailure"], "@timestamp"=>2017-12-04T12:16:47.409Z, "user"=>"EZPROXY"}}
{
                         "date" => "29.03.2009;02:42:01",
                  "libraryDept" => "00",
                      "country" => "si",
                       "source" => "e:\\data\\try_again.txt",
                         "type" => 57,
                          "cir" => "C2",
               "patronCategory" => "020",
    "transactionHostDepartment" => "01",
                     "@version" => "1",
                         "beat" => {
            "name" => "elastic",
        "hostname" => "elastic",
         "version" => "6.0.0"
    },
                         "host" => "elastic",
                   "schoolName" => "19",
              "firstsignUpDate" => 2004-11-08T23:00:00.000Z,
                       "offset" => 272,
                      "acronym" => "CEKLJ",
                     "patronId" => "0051022",
                   "schoolDept" => "1",
                   "prospector" => {
        "type" => "log"
    },
                  "libraryCode" => "50005",
            "membIdentificNumb" => "0051022",
                      "message" => "|cir=C2|date=29.03.2009 02:42:01|acronym=CEKLJ|libraryCode=50005|user=EZPROXY|type=5
7|transactionHostDepartment=01|membIdentificNumb=0051022|patronId=0051022|patronCategory=020|lastVisitDate=28.03.2009|sc
hoolName=19|schoolDept=1|libraryDept=00|firstsignUpDate=09.11.2004",
                "lastVisitDate" => 2009-03-27T23:00:00.000Z,
                         "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_dateparsefailure"
    ],
                   "@timestamp" => 2017-12-04T12:16:47.409Z,
                         "user" => "EZPROXY"
}
[2017-12-04T13:16:52,183][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x7e46b673@E:/logstash/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}

Still no error.
I can try once more with the level raised to trace.

It is very strange that it works for the other dates, but not the timestamp. I did a quick test, and it seems to work if the timezone specification is removed, so I wonder if there is a bug with the date filter plugin?

I need the timezone so that my data is correct.
I'm also leaning towards a bug in the date plugin.

I've tried with one day +/-, one month +, and one year +, and all of those worked.

Should I file a bug on GitHub, or will you or @magnusbaeck?

Feel free to open an issue on GitHub. Provide the config as well as example data so it can be recreated easily.

Here's a link to the issue, if anyone wants to follow along:
GitHub issue

I found out today that this time simply does not exist: it falls inside the DST transition, when local clocks jump straight from 2:00 to 3:00.
But the error message should give more details.
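(A quick way to confirm this outside Logstash, sketched with the tzinfo Ruby gem, which is not part of this pipeline; 2.x API assumed:)

require 'tzinfo' # gem install tzinfo

tz = TZInfo::Timezone.get('Europe/Ljubljana')
# 29.03.2009 02:42:01 falls in the spring-forward gap (02:00 jumps to 03:00),
# so this local time never existed and the conversion raises PeriodNotFound.
begin
  tz.local_time(2009, 3, 29, 2, 42, 1)
rescue TZInfo::PeriodNotFound
  puts 'Nonexistent local time: 29.03.2009 02:42:01'
end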

