Getting timestamp inside timestamp

Hi Everyone,

I am new to Logstash, so please forgive me if this is a silly question.

I am trying to convert the timestamp from the log file into @timestamp.
Here is my logstash conf file.

input {
    udp {
        port => 5959
    }
}

filter {
    csv {
       separator => "|"
       quote_char => "\x00"
       columns => ["timestamp", "probablecause", "alarmseverity", "servicename", "servername"]
    }
    
    mutate { 
        #remove_field => ["@version", "host", "path"] 
        convert => { "timestamp" => "string" }
    }
    
    date {
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "ISO8601"]
        #yyyy-MM-dd HH:mm:ss
        #target => "@timestamp"
        #timezone => "IST"
    }
}

output {
        elasticsearch { 
                hosts => ["localhost:9200"]
                index => "indexfortesting"
                codec => rubydebug
            }    
}

This is my actual log file content.

2021-10-12 19:22:11|connection-establishment-error|cleared|ER|SAPC_PL-4|
2021-10-12 19:23:12|connection-establishment-error|major|ER|SAPC_PL-4|
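For local testing, lines like these could be fed to the udp input on port 5959 with a plain socket. This is just a sketch (the host, port constant, and the make_line helper are assumptions for illustration; it is not the Python log-handler setup discussed later in this thread):

```python
# Hypothetical sender for local testing; make_line and HOST/PORT are
# illustration only, not part of the poster's actual setup.
import socket

HOST, PORT = "127.0.0.1", 5959  # matches the udp input in the conf above

def make_line(ts, cause, severity, service, server):
    # Build one pipe-delimited record in the order the csv filter expects
    return f"{ts}|{cause}|{severity}|{service}|{server}|"

line = make_line("2021-10-12 19:22:11", "connection-establishment-error",
                 "cleared", "ER", "SAPC_PL-4")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(line.encode("utf-8"), (HOST, PORT))
sock.close()
```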

Here's what my data looks like on the Dashboard.

Please suggest how I can use the timestamp from the log file.

Thanks in advance.

I'm not sure how your output timestamp field could get the value you showed. Can you change your output to stdout and share the output of your pipeline?

Also, you are using quote_char in your csv filter, but there is nothing like that in the sample you shared. Do you really have the NULL character around your csv values?

Assuming that your parse is working and you have the following value in the timestamp field:

timestamp => 2021-10-12 19:22:11

Your date filter is OK to parse this value into the @timestamp field. If you are not in the UTC timezone you will need to specify that, but you cannot use the named timezone IST, as it is ambiguous; you will need to pass the time difference in numbers.

For example:

date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    timezone => "+0530"
}
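As a sanity check of the arithmetic this filter performs, here is a minimal Python sketch (standard library only, not part of the pipeline) that interprets the sample timestamp as UTC+05:30 and converts it to UTC, which is roughly what the date filter does for @timestamp:

```python
from datetime import datetime, timedelta, timezone

# Interpret the log timestamp as UTC+05:30 (what timezone => "+0530"
# tells the date filter to assume), then convert to UTC for @timestamp.
ist = timezone(timedelta(hours=5, minutes=30))
local = datetime.strptime("2021-10-12 19:22:11",
                          "%Y-%m-%d %H:%M:%S").replace(tzinfo=ist)
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2021-10-12T13:52:11+00:00
```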

@leandrojmp
The reason for using quote_char is that I am sending data from Python via a log handler, so there are no double quotes around the values. The current quote_char worked for all the fields except timestamp.

I took your suggestion and added timezone.

date {
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss"]
        target => "@timestamp"
        timezone => "+0530"
    }

Here's the output on the terminal:

I am getting _dateparsefailure, and I'm not sure why.

Please do not share images; they can be very hard to read, and some people may not be able to see them at all.

Can you change your output to stdout { codec => rubydebug } and share the output of one event using preformatted text?

From what I was able to see in the image you shared, your message is not being parsed; your timestamp field is not right, it has an entire JSON document in it.

@leandrojmp

Here's the output:

{
         "@version" => "1",
    "alarmseverity" => "WA",
             "type" => "5G Logs",
    "probablecause" => "minor",
       "@timestamp" => 2021-10-12T14:47:14.642Z,
      "servicename" => "SAPC_PL-5",
        "timestamp" => "{\"@timestamp\": \"2021-10-12T14:47:14.642Z\", \"@version\": \"1\", \"message\": \"system-resources-overload",
             "tags" => [
        [0] "_dateparsefailure"
    ],
          "message" => "{\"@timestamp\": \"2021-10-12T14:47:14.642Z\", \"@version\": \"1\", \"message\": \"system-resources-overload|minor|WA|SAPC_PL-5|2021-10-13 11:24:05\", \"host\": \"DDC5-L-DFD82B3\", \"path\": \"C:\\\\Users\\\\PRASAN~1.BIY\\\\AppData\\\\Local\\\\Temp/ipykernel_25640/3145687460.py\", \"tags\": [], \"type\": \"logstash\", \"level\": \"INFO\", \"logger_name\": \"5G Logs\", \"stack_info\": null}",
       "servername" => "2021-10-13 11:24:05\", \"host\": \"DDC5-L-DFD82B3\", \"path\": \"C:\\\\Users\\\\PRASAN~1.BIY\\\\AppData\\\\Local\\\\Temp/ipykernel_25640/3145687460.py\", \"tags\": [], \"type\": \"logstash\", \"level\": \"INFO\", \"logger_name\": \"5G Logs\", \"stack_info\": null}",
             "host" => "127.0.0.1"
}

Your csv filter is not working because your original message field is JSON. You will need to parse this field first so that you get a new message field in the format your csv filter expects.

Try adding the following filter above your csv filter:

json {
    source => "message"
} 
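To see why the csv filter fails without this, here is a Python sketch using a trimmed-down version of the message field from the event above: the Python log handler wrapped the record in JSON, so the pipe-delimited line the csv filter wants is nested one level down.

```python
import json

# Trimmed version of the event's "message" field: the log handler
# wrapped the record in JSON, so the csv filter never saw the raw line.
outer = ('{"@timestamp": "2021-10-12T14:47:14.642Z", "@version": "1", '
         '"message": "system-resources-overload|minor|WA|SAPC_PL-5|'
         '2021-10-13 11:24:05"}')

inner = json.loads(outer)["message"]  # what the json filter would extract
fields = inner.split("|")             # now the pipe-delimited split works
print(fields)
```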

I'm using Europe/Paris as the timezone (+2).

INPUT DATA :
2021-10-12 19:22:11|connection-establishment-error|cleared|ER|SAPC_PL-4|
2021-10-12 19:23:12|connection-establishment-error|major|ER|SAPC_PL-4|

Filter :

filter {
  
    grok {
      match          => {
        "message"    => [
          "^%{TIMESTAMP_ISO8601:logdate}%{GREEDYDATA:[@metadata][specificlogcontent]}",
          "%{GREEDYDATA:FAILPARSE}"
        ]
      }
    }
    
    ################### DATE ####################
    # Restore @timestamp with its original value
    if [origtimestamp] {
      mutate {
        convert => [ "origtimestamp", "string" ]
      }
      date {
         target => "@timestamp"
         match => ["origtimestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss", "YYYY-MM-dd HH:mm:ss"]
         #remove_field => [ "origtimestamp" ]
      }
    }

    date {
      match    => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "Europe/Paris"
      target   => "TIMESTAMP"
    }
  }

OUTPUT :
{
"@metadata": {
"specificlogcontent": "|connection-establishment-error|cleared|ER|SAPC_PL-4|\r"
},
"@timestamp": "2021-10-12T14:46:47.173Z",
"@version": "1",
"TIMESTAMP": "2021-10-12T17:22:11.000Z",
"host": "localhost",
"logdate": "2021-10-12 19:22:11",
"message": "2021-10-12 19:22:11|connection-establishment-error|cleared|ER|SAPC_PL-4|\r"
}
{
"@metadata": {
"specificlogcontent": "|connection-establishment-error|major|ER|SAPC_PL-4|"
},
"@timestamp": "2021-10-12T14:46:47.175Z",
"@version": "1",
"TIMESTAMP": "2021-10-12T17:23:12.000Z",
"host": "localhost",
"logdate": "2021-10-12 19:23:12",
"message": "2021-10-12 19:23:12|connection-establishment-error|major|ER|SAPC_PL-4|"
}
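A quick Python check (assuming Python 3.9+ for zoneinfo; not part of the pipeline) reproduces the TIMESTAMP value shown in the output above:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Parse the first sample timestamp in Europe/Paris (CEST, UTC+2 on that
# date) and convert to UTC, mirroring what the date filter produces.
local = datetime.strptime("2021-10-12 19:22:11",
                          "%Y-%m-%d %H:%M:%S").replace(
    tzinfo=ZoneInfo("Europe/Paris"))
utc_ts = local.astimezone(ZoneInfo("UTC")).isoformat()
print(utc_ts)  # 2021-10-12T17:22:11+00:00
```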

@leandrojmp

I tried your approach. I am getting this data:

{
       "@timestamp" => 2021-10-14T12:51:28.000Z,
             "type" => "logstash",
            "level" => "INFO",
    "alarmseverity" => "cleared",
      "servicename" => "ER",
        "timestamp" => "2021-10-14 18:21:28",
       "servername" => "SAPC_PL-3",
          "message" => "system-resources-overload|cleared|ER|SAPC_PL-3|2021-10-14 18:21:28",
             "host" => "DDC5-L-DFD82B3",
             "path" => "C:\\Users\\PRASAN~1.BIY\\AppData\\Local\\Temp/ipykernel_25640/885417759.py",
         "@version" => "1",
    "probablecause" => "system-resources-overload",
      "logger_name" => "5G Logs",
             "tags" => [],
       "stack_info" => nil
}

Here I am getting two timestamps, i.e. timestamp and @timestamp.
Is this expected?

Yes. @timestamp is a field created by Logstash when the event enters the filter block of your pipeline; the timestamp field is the name you gave when parsing your message with the csv filter.

Also, the @timestamp field will always be in UTC/GMT.

It seems that your date filter is working now, as the difference between the two fields is 05:30, which I think is the offset of your timezone from UTC, assuming that IST means India Standard Time.

Yes, IST is Indian Standard Time.
Thanks for the quick solution @leandrojmp, really appreciated :smiley:

Just a tip: abbreviated timezones like IST are not always reliable, as they can refer to different time zones.

For example, IST can be the abbreviation of:

  • India Standard Time (UTC + 05:30)
  • Irish Standard Time (UTC + 01:00)
  • Israel Standard Time (UTC + 02:00)

So, always try to use the exact time difference, like +0530, or the name used by IANA in the tz database, in this case Asia/Kolkata.

I would say that it is better to use Asia/Kolkata instead of +0530, but this will give you the same result.
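A quick check with Python's zoneinfo (3.9+; just an illustration, unrelated to the pipeline itself) shows why the two are equivalent here:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Asia/Kolkata has no daylight saving time, so the IANA name and the
# literal offset +0530 always agree.
dt = datetime(2021, 10, 12, 19, 22, 11, tzinfo=ZoneInfo("Asia/Kolkata"))
print(dt.utcoffset())  # 5:30:00
```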


The filter is built on top of Joda, which maintains its own set of timezones: basically IANA/Olson plus a few others. When the set at the bottom of the stack changes, each of the layers above needs a release to incorporate it. This can take a while.

The reality is that a lot of places have a ?ST or a ?DT. Obviously the ?ST does not get adjusted twice a year to match the ?DT, whereas a ?DT may or may not be adjusted to match the equivalent ?ST; it is a local convention. Do not use them.
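A short Python sketch (zoneinfo, 3.9+; illustration only) shows the offset shifting with the season for a zone such as Europe/Paris, which is why fixed abbreviations and offsets are unreliable:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Europe/Paris switches between CET (UTC+1) and CEST (UTC+2), so a fixed
# abbreviation or offset would be wrong for half the year.
tz = ZoneInfo("Europe/Paris")
winter = datetime(2021, 1, 1, 12, 0, tzinfo=tz)
summer = datetime(2021, 7, 1, 12, 0, tzinfo=tz)
print(winter.utcoffset(), summer.utcoffset())  # 1:00:00 2:00:00
```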

Thanks for the tip @leandrojmp. I will take that into consideration :blush:
