Error to map date field with format yyyy-MM-dd HH:mm:ss.SSS

Hello,
can anyone help me map the date 2018-12-03 06:00:00.000?

I have tried to set the mapping format for loading via Logstash as
yyyy-MM-dd HH:mm:ss.SSS

but I am getting the error below:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [updated_date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-11-21T18:30:00.000Z" is malformed at "T18:30:00.000Z""}

Thanks in advance

It seems that your date is not 2018-12-03 06:00:00.000 but rather 2018-12-03T06:00:00.000?

If yes, then your issue is mentioned in the documentation: https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html

For non-formatting syntax, you'll need to put single-quote characters around the value. For example, if you were parsing ISO8601 time, "2015-01-01T01:12:23" that little "T" isn't a valid time format, and you want to say "literally, a T", your format would be this: "yyyy-MM-dd'T'HH:mm:ss"

So, the format should be: yyyy-MM-dd'T'HH:mm:ss.SSS.
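Applied to a date filter in the Logstash pipeline (a sketch only, not confirmed against your config; the field name updated_date is taken from your error message), that pattern would look like:

```
filter {
  date {
    # single quotes make the 'T' a literal character, per the docs above
    match => ["updated_date", "yyyy-MM-dd'T'HH:mm:ss.SSS"]
    # overwrite the original field with the parsed timestamp
    target => "updated_date"
  }
}
```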

If this didn't help, I need to see the input data, Logstash config and the error message you received.

Hello, thanks for the reply.
The above format didn't work for me.
A sample date as stored in the database for the column updated_date is shown below:
2017-12-18 00:00:00.000

Here is the Logstash config:

input {
  jdbc {
    jdbc_driver_library => "D:/Softwares/logstash-6.4.2/lib/sqljdbc4-2.0.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://:/***"
    jdbc_user => "####"
    jdbc_password => "###"
    statement => "SELECT id, name, updated_date FROM sample"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
  }
}

output {
  elasticsearch {
    codec => json
    hosts => ["localhost:9200"]
    index => "idx_sqldata"
  }
  stdout { codec => rubydebug }
}

Output

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"idx_sqldata", :_type=>"_doc", :_routing=>nil}, #<LogStash::Event:0x613cc85f>], :response=>{"index"=>{"_index"=>"idx_sqldata", "_type"=>"_doc", "_id"=>"bcDpnGcBu0SneeiwZGsB", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [updated_date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-11-21T18:30:00.000Z" is malformed at "Z""}}}}}

As the log states, it is a mapping problem with Elasticsearch. What kind of mapping are you using for the idx_sqldata index?
This means that the updated_date field has been mapped to a different format than the one you are trying to pass to it.
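One quick way to verify this (a standard Elasticsearch API, shown here with your index name) is to fetch the current mapping and compare the format of updated_date with the value in the error message:

```
GET idx_sqldata/_mapping
```

In Kibana Dev Tools, or via curl: curl -XGET 'localhost:9200/idx_sqldata/_mapping?pretty'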

Hello,

I am using the mapping shown below:

"last_updated_date":{
"type":"date",
"format" : "yyyy-MM-dd'T'HH:mm:ss.SSS"
}

The mapping you gave is for "last_updated_date" and the error message indicates that the field is "updated_date".

Thanks for the reply.
I have applied the same mapping to the field "updated_date" and the result is the same.

And you recreated the index or reindexed the old data so the mapping gets applied?

Well, anyway, the problem is the character 'Z' at the end.
As it says:

"reason"=>"Invalid format: "2018-11-21T18:30:00.000Z" is malformed at "Z"

Why don't you post what the rubydebug output actually says about the event? Maybe that will reveal what you are really trying to insert into Elasticsearch :slight_smile:

The output is as follows:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"idx_sqldata", :_type=>"_doc", :_routing=>nil}, #<LogStash::Event:0x613cc85f>], :response=>{"index"=>{"_index"=>"idx_sqldata", "_type"=>"_doc", "_id"=>"bcDpnGcBu0SneeiwZGsB", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [updated_date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-11-21T18:30:00.000Z" is malformed at "Z""}}}}}

Yes, I saw it already from your previous post.

But what I need to see is the event that you are trying to insert into Elasticsearch. You already have the means to capture it in your config: the output section includes stdout with the rubydebug codec.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.