How to change all dates to a supported format


#1

I'm sending JSON to logstash and it has about a dozen fields with dates in this format:
yyyy-MM-dd HH:mm:ss

That date format generates an error because it is missing the "T" separator between the date and the time. The error Logstash gives is:

[2017-05-28T10:47:34,167][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"myindex", :_type=>"mytype", :_routing=>nil}, 2017-05-28T14:47:33.630Z myhost.local %{message}], :response=>{"index"=>{"_index"=>"myindex", "_type"=>"mytype", "_id"=>"AVxPhs46KqpnKtvQdflL", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [the.json.name]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2017-05-24 02:00:01" is malformed at " 02:00:01""}}}}}

How do I change all dates in the above format to yyyy-MM-dd'T'HH:mm:ss so that logstash will import them? These fields are not the timestamp field (that is another field and it is in a logstash-acceptable format). Since they are not the document timestamp, from what I understand, I don't use filter { date {...}}, but what do I use?

Thanks for the help.


(Thiago Souza) #2

Hello,

You would need to apply the date filter for each field (you can set a different target field other than @timestamp).
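A minimal sketch of that approach, assuming one of the fields is named my_date_field (the field name is a placeholder; repeat one date block per field):

```conf
filter {
  date {
    # Parse the non-ISO format coming in from the JSON.
    match  => ["my_date_field", "yyyy-MM-dd HH:mm:ss"]
    # "target" defaults to @timestamp, so it must be set explicitly
    # to write the parsed date back into the same field.
    target => "my_date_field"
  }
  # ...one date block per remaining date field
}
```

The parsed value is re-serialized as ISO8601 (with the "T"), which Elasticsearch's default date mapping accepts.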

Another option would be changing the mapping in Elasticsearch so it also supports this format (if you do this, keep in mind that your date format does not contain a timezone, so Elasticsearch will use the server's timezone).
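As a sketch of the mapping approach (index, type, and field names below are placeholders): the || syntax in the format string lets a single date field accept several formats.

```json
PUT myindex
{
  "mappings": {
    "mytype": {
      "properties": {
        "my_date_field": {
          "type":   "date",
          "format": "yyyy-MM-dd HH:mm:ss||strict_date_optional_time||epoch_millis"
        }
      }
    }
  }
}
```

Note that a mapping change like this only applies to newly created indices (or you would need to reindex), and it must be repeated for each of your date fields.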

Regards


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.