Ingest Grok Pipeline - Unix Timestamps

Hi all. I'm trying to figure out how to convert an epoch timestamp (in seconds.milliseconds format) into a date/time format in Elasticsearch. Here's an example of the log line I'm trying to parse:

8 - {8249} [1508745765.02767] Execution Time: 0.671

I've already set up the following grok processor on the ingest pipeline:

{
  "description": "Grok Transaction Times From HttpLog",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["\\A%{NUMBER:server_id} - \\{%{NUMBER:p_id}} \\[%{NUMBER:epoch_timestamp}] Execution Time: %{NUMBER:exec_time}"]
      }
    }
  ]
}
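To sanity-check what that grok pattern actually captures from the sample line, here's a rough Python equivalent (the `%{NUMBER}` pattern is expanded by hand to a simplified regex; field names and the sample line are taken from the post above):

```python
import re

line = "8 - {8249} [1508745765.02767] Execution Time: 0.671"

# Simplified stand-in for grok's NUMBER pattern
NUMBER = r"[+-]?\d+(?:\.\d+)?"

pattern = re.compile(
    rf"\A(?P<server_id>{NUMBER}) - \{{(?P<p_id>{NUMBER})\}} "
    rf"\[(?P<epoch_timestamp>{NUMBER})\] Execution Time: (?P<exec_time>{NUMBER})"
)

m = pattern.match(line)
print(m.groupdict())
# {'server_id': '8', 'p_id': '8249', 'epoch_timestamp': '1508745765.02767', 'exec_time': '0.671'}
```

Note that, like grok, this extracts every field as a string; converting `epoch_timestamp` into a real date is the separate step discussed below.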

I found this post that seems to describe a similar problem, but I'm having difficulty adapting it to my own pattern. I've concluded that I need to multiply the value by 1000 so the date processor can convert it with the UNIX_MS format, but how do I define that multiplication in the pipeline?

(Disclaimer: we are not using Logstash for various reasons that aren't open to discussion right now. It's an option for the future but only as a last resort.)

Does a date processor using UNIX (rather than UNIX_MS) do the job for you?

Possibly. The only examples I've seen up to this point used UNIX_MS, but now I see that UNIX is an option, and I'm presuming it means seconds since the epoch (1970-01-01) rather than milliseconds.
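For anyone else unsure about the distinction: the same instant can be expressed either way, and a quick Python sketch shows what each format expects (the timestamp is the one from the sample log line; treating it as UTC is an assumption):

```python
from datetime import datetime, timezone

# Timestamp from the sample log line: seconds since the epoch,
# with a fractional part (assumed to be UTC)
ts = 1508745765.02767

# UNIX expects (possibly fractional) seconds
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())

# UNIX_MS would instead expect the same instant scaled to milliseconds
ts_ms = round(ts * 1000)
print(ts_ms)  # 1508745765028
```

So if the date processor accepts the raw seconds value via the UNIX format, the multiply-by-1000 step becomes unnecessary.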

I figured it out. Here's the solution:

{
  "description": "Grok Transaction Times From HttpLog",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["\\A%{NUMBER:server_id} - \\{%{NUMBER:p_id}} \\[%{NUMBER:trans_timestamp}] Execution Time: %{NUMBER:exec_time}"]
      }
    },
    {
      "date": {
        "field": "trans_timestamp",
        "formats": ["UNIX"]
      }
    }
  ]
}
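If anyone wants to verify a pipeline like this before attaching it to an index, the `_simulate` endpoint will run a sample document through it without indexing anything. The request body below just combines the pipeline from this thread with the sample log line:

```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "Grok Transaction Times From HttpLog",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["\\A%{NUMBER:server_id} - \\{%{NUMBER:p_id}} \\[%{NUMBER:trans_timestamp}] Execution Time: %{NUMBER:exec_time}"]
        }
      },
      {
        "date": {
          "field": "trans_timestamp",
          "formats": ["UNIX"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "8 - {8249} [1508745765.02767] Execution Time: 0.671" } }
  ]
}
```

The response shows the resulting document, including the `@timestamp` field that the date processor writes by default.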

It just took me a little while to properly understand the order of operations when adding additional processors: they run in sequence, so the grok processor has to extract trans_timestamp before the date processor can parse it.