Hi all. I'm trying to figure out how to convert an epoch timestamp (in seconds.milliseconds format) into a date/time format in Elasticsearch. Here's an example of the log line I'm trying to parse:
8 - {8249} [1508745765.02767] Execution Time: 0.671
I've already set up the following grok processor on the ingest pipeline:
{
  "description": "Grok Transaction Times From HttpLog",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["\\A%{NUMBER:server_id} - \\{%{NUMBER:p_id}} \\[%{NUMBER:epoch_timestamp}] Execution Time: %{NUMBER:exec_time}"]
      }
    }
  ]
}
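For reference, here's roughly how I've been testing the grok pattern so far, using the simulate endpoint from Dev Tools (trimmed to the relevant bits):

```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "Grok Transaction Times From HttpLog",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["\\A%{NUMBER:server_id} - \\{%{NUMBER:p_id}} \\[%{NUMBER:epoch_timestamp}] Execution Time: %{NUMBER:exec_time}"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "8 - {8249} [1508745765.02767] Execution Time: 0.671" } }
  ]
}
```

The grok part works fine; `epoch_timestamp` comes out as the string "1508745765.02767", which is where I'm stuck.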
I found this post that seems to describe a similar problem, but I'm having trouble adapting its solution to my own pattern. I've concluded that I need to multiply the value by 1000 to get milliseconds, which I could then parse with the UNIX_MS format, but how do I define that step in the pipeline?
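My rough idea so far is a script processor to do the multiplication, followed by a date processor, something like the snippet below. I haven't verified the Painless syntax, and the `epoch_millis` field name is just a placeholder I made up for the intermediate value:

```
{
  "script": {
    "lang": "painless",
    "source": "ctx.epoch_millis = (long) (Double.parseDouble(ctx.epoch_timestamp) * 1000)"
  }
},
{
  "date": {
    "field": "epoch_millis",
    "formats": ["UNIX_MS"],
    "target_field": "@timestamp"
  }
}
```

Is that the right approach, or is there a way to do this in the date processor alone without the intermediate field?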
(Disclaimer: we are not using Logstash for various reasons that aren't open to discussion right now. It's an option for the future but only as a last resort.)