I have to read multiple service logs that contain different time formats, up to microsecond precision. Logstash is able to parse all the timestamps with a `%{TIMESTAMP_ISO8601:timestamp}` filter, but I get a 400 from Elasticsearch for some logs because of the various time formats, and I am not able to figure out what is wrong with the date format in my mapping (set through Kibana):
"mappings": { "properties": { "timestamp": { "type": "date", "format": "strict_date_optional_time||yyyy-MM-dd['T'][ ][HH:mm:ss][,SSSSSSSSS][.SSSSSSSSS]['Z']||yyyy-MM-dd [HH:mm:ss][,SSSSSSSSS]||yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss.SSSSSSSSS||epoch_millis" } } }
I thought this one pattern, `yyyy-MM-dd['T'][ ][HH:mm:ss][,SSSSSSSSS][.SSSSSSSSS]['Z']`, alone should be able to parse all the time formats we have, but it fails even for these:
2023-02-25T15:50:53.718970614Z
2023-02-25 15:50:53,718
-
2023-02-25T15:50:53.718Z
Isn't `[ ]` supposed to mean optional? So shouldn't `['T']` and `[ ]` both be tried within the single pattern? Also, how do I allow the fractional seconds to be anywhere between 3 and 9 digits? Are there any docs I can follow to understand these patterns?
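Since Elasticsearch custom date formats are built on java.time patterns, I tried to reproduce the bracket semantics locally with `DateTimeFormatterBuilder` (a sketch; class and method names are my own, and `'Z'` is treated as a literal here, just like in my mapping pattern). `appendFraction` with a min of 3 and max of 9 is what seems to cover the variable-precision case:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.temporal.ChronoField;

public class TimestampCheck {
    // Optional sections in [brackets] may each match or be skipped,
    // the same idea as optional parts in Elasticsearch custom formats.
    static final DateTimeFormatter FMT = new DateTimeFormatterBuilder()
            .appendPattern("yyyy-MM-dd['T'][ ]HH:mm:ss")
            // accept 3..9 fractional-second digits after '.' ...
            .optionalStart()
            .appendLiteral('.')
            .appendFraction(ChronoField.NANO_OF_SECOND, 3, 9, false)
            .optionalEnd()
            // ... or after ','
            .optionalStart()
            .appendLiteral(',')
            .appendFraction(ChronoField.NANO_OF_SECOND, 3, 9, false)
            .optionalEnd()
            // trailing 'Z' as an optional literal, not a real offset
            .appendPattern("['Z']")
            .toFormatter();

    static LocalDateTime parse(String s) {
        return LocalDateTime.parse(s, FMT);
    }

    public static void main(String[] args) {
        System.out.println(parse("2023-02-25T15:50:53.718970614Z")); // 2023-02-25T15:50:53.718970614
        System.out.println(parse("2023-02-25 15:50:53,718"));        // 2023-02-25T15:50:53.718
        System.out.println(parse("2023-02-25T15:50:53.718Z"));       // 2023-02-25T15:50:53.718
    }
}
```

This single formatter parses all three non-empty samples above, which is what makes me think the same should be expressible in the mapping's `format` string.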
Also, I understand this is not the right way to push logs, but it is what we need right now; we will fix it later.