Grok parsing timestamp with 2 fields

Hi,
I have a log where the timestamp is split across two fields. How would I go about parsing it? I couldn't find any examples online!

{"type": "GreatLog", **"date": "10/3/2021", "time": "6:21:35 AM"**, "message": "Take a Measurement ", "data": "<94;1;0;0;97;168;19;136;0;52;(0)><134217728>", "clientntId": "959", "ddid": "D9-9999-004F"}

I would suggest a json filter to parse the JSON, then mutate+add_field to combine the date and time fields using sprintf references, then a date filter. There are many, many examples of each of those in this forum.

@Badger Many thanks to you, I wasn't searching in the right place.
I didn't know parsing was a first step; I thought it was done automatically.
I did try this example but wasn't sure about the syntax:

if "GW" in [path] {

    mutate { add_field => { "[@metadata][ts]" => "%{date} %{time}" } }

    date { match => [ "[@metadata][ts]", "%{DATE_US:date} hh:mm:ss a" ] }

  }

How does the [ts] relate to the @metadata?

Cheers

The [@metadata] field contains fields that are visible in the pipeline but ignored by outputs, so it is useful for storing interim results while processing events.
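
For example, here is a minimal sketch (the field name ts is just a placeholder) showing that a value stored under [@metadata] can be used by later filters but is dropped by outputs; the rubydebug codec can be asked to print it for debugging:

filter {
    # interim value, visible to later filters but not sent to outputs
    mutate { add_field => { "[@metadata][ts]" => "%{date} %{time}" } }
}
output {
    # metadata => true makes rubydebug print [@metadata] for debugging;
    # without it, the field does not appear in the output at all
    stdout { codec => rubydebug { metadata => true } }
}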

Can you show me how it would be done so I can reverse-engineer it in order to understand?
{"type": "GreatLog", "date": "10/3/2021", "time": "6:21:35 AM", "message": "Take a Measurement ", "data": "<94;1;0;0;97;168;19;136;0;52;(0)><134217728>", "clientntId": "959", "ddid": "D9-9999-004F"}

Try this config:

filter {
    # parse the JSON string in the message field into individual event fields
    json {
        source => "message"
    }
    # combine the date and time fields into a temporary field under [@metadata]
    mutate {
        add_field => { "[@metadata][ts]" => "%{date} %{time}" }
    }
    # parse the combined value and set @timestamp
    date {
        match => ["[@metadata][ts]", "M/d/yyyy h:mm:ss a"]
    }
}

It will give an output like this:

{
      "@version" => "1",
          "host" => "logstash-host-name",
          "date" => "10/3/2021",
          "ddid" => "D9-9999-004F",
    "clientntId" => "959",
    "@timestamp" => 2021-10-03T06:21:35.000Z,
          "time" => "6:21:35 AM",
       "message" => "Take a Measurement ",
          "type" => "GreatLog",
          "data" => "<94;1;0;0;97;168;19;136;0;52;(0)><134217728>"
}

Also, the @timestamp field is always stored in UTC. If your original date is not in UTC, you will need to use the timezone option in the date filter to tell Logstash which timezone your date is in.
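
For example, if the original date and time were recorded in Eastern time (just an illustration, substitute your own timezone), the date filter could look like this:

date {
    match => ["[@metadata][ts]", "M/d/yyyy h:mm:ss a"]
    # interpret the source timestamp as America/New_York;
    # @timestamp itself is still stored in UTC
    timezone => "America/New_York"
}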

Thank you very much!!! I am still not sure how the syntax works.
Does [@metadata][ts] mean concatenation?
Is [ts] just a short name for timestamp?

It is just an arbitrary field name. @metadata is a JSON object and ts is a field inside that object; you can use any name you like.

For example:

{
    "@metadata": {
        "ts": "your-value",
        "anything": "another-value"
    }
}

This means that you have two fields, @metadata.ts and @metadata.anything. When using those fields in a Logstash filter you need the square-bracket field reference syntax to access them, like [@metadata][ts] and [@metadata][anything].
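
For instance, here is a small sketch (the values are only illustrative) using those references both in a conditional and inside a sprintf string:

filter {
    # square brackets access the nested field directly;
    # %{[@metadata][ts]} is the sprintf form of the same reference
    if [@metadata][anything] == "another-value" {
        mutate {
            add_field => { "note" => "ts was %{[@metadata][ts]}" }
        }
    }
}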

@leandrojmp
Thank you so much for taking the time and clearing it up for me!
