How to replace @timestamp with actual log time

I am using the grok filter below to parse the log:

filter {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document type} is %{INT:duration:int}" ]

        match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document type} merchant : %{HOSTNAME:merchant} is %{INT:duration:int}" ]

        match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA} (\[%{WORD:loglevel}\])" ]
    }

    if "beats_input_codec_plain_applied" in [tags] {
        mutate {
            remove_tag => ["beats_input_codec_plain_applied"]
        }
    }

    if "_grokparsefailure" in [tags] {
        mutate {
            remove_tag => ["_grokparsefailure"]
        }
    }

    mutate { remove_field => [ "host", "@version", "source", "input", "tags", "prospector", "offset" ] }

    if "monitoring" in [message] or "harvester" in [message] {
        drop {}
    }
}

Here is a sample log line:

2018-09-20 10:11:10 [INFO] from application in pool-3-thread-20 - Document Authentication IDEAnalysis processing time for transactionId : 6057104998582039_node1 documentType : License merchant : 9a632f34-9cbe-4d5c-8fc9-23fceb263a94 is 8 msec

I am getting a different time in @timestamp. Can anyone please suggest a fix?

Use a date filter to parse the timestamp field into @timestamp.

I used the date filter below but it's not working:

date {
    match => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss" ]
}

So what do you get? Show an example event, e.g. by copy/paste of the raw document from Kibana's JSON tab.

Input log:

2018-09-20 10:35:36 917 [DEBUG] from org.mongodb.driver.protocol.command in application-akka.actor.default-dispatcher-383 - Sending command {update : BsonString{value='accounts'}} to database dataIntelligence on connection [connectionId{localValue:7, serverValue:249520}] to server

Output:

{
          "tags" => [
        [0] "_grokparsefailure"
    ],
       "message" => "2018-09-20 10:35:36 917 [DEBUG] from org.mongodb.driver.protocol.command in application-akka.actor.default-dispatcher-383 - Sending command {update : BsonString{value='accounts'}} to database dataIntelligence on connection [connectionId{localValue:7, serverValue:249520}] to server",
          "host" => "node1",
    "@timestamp" => 2018-09-20T12:51:47.217Z,
      "@version" => "1"
}

Grok filter:

input { stdin { } }

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA}" }
  }
  date {
    match => [ "timestamp" , "ISO8601" , "yyyy-MM-dd HH:mm:ss" ]
  }
}
output {
  stdout { codec => rubydebug }
}

Since your grok filter is failing, no timestamp field is being created.

It appears you have a space between the seconds and the milliseconds (or whatever "917" is), and TIMESTAMP_ISO8601 doesn't match that.

917 is a separate field. The timestamp is not failing; I checked in the grok debugger.

> 917 is a separate field

Okay, but you're not including it in your grok expression. According to the expression, the loglevel comes immediately after the timestamp, but that's obviously not true.
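For example, here is a sketch of one way to cover both line shapes by making the extra token optional (the millis and logmessage field names are just for illustration):

# "917" is assumed to be a millisecond count that only appears in some
# lines, so it is matched by an optional group
grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}(?: %{INT:millis})? \[%{WORD:loglevel}\] %{GREEDYDATA:logmessage}" }
}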

You mean to say that if there are any grok failures, the date filter will not work?

If grok fails, it won't extract the field that the date filter needs to parse, so the date filter will obviously also fail.
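If you want to make that dependency explicit, one option (just a sketch) is to run the date filter only when grok succeeded:

# skip date parsing entirely on grok failure, since the
# timestamp field won't exist on those events
if "_grokparsefailure" not in [tags] {
    date {
        match => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss" ]
    }
}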

Thanks for the reply. The grok filter below is working, but when I replace [ ] with { } it's not working:

match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document type} is %{INT:duration:int}" ]

The one below is not working:

> match => { "message", "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document type} is %{INT:duration:int}" }

That's right. You need to use either [ ... , ... ] or { ... => ... }; { ... , ... } won't work.
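To make the two valid forms concrete (patterns shortened for illustration):

# Array form: [ field, pattern ]
grok { match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA}" ] }

# Hash form: { field => pattern } -- note the => in place of the comma
grok { match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA}" } }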

@magnusbaeck thank you, it's working for me. Below is the JSON output. Some extra characters were added to the @timestamp field (T and 000Z); how do I remove those?

{
       "@timestamp" => 2018-09-20T12:19:42.000Z,
         "@version" => "1",
        "timestamp" => "2018-09-20 12:19:42",
           "method" => "Transaction Data encryption and insertion",
             "host" => "node1",
         "document" => "License",
          "message" => "2018-09-20 12:19:42 [INFO] from application in pool-3-thread-5 - Transaction Data encryption and insertion processing time for transactionId : 6064824338348622_node1 documentType : License is 5 msec",
    "transactionid" => "6064824338348622_node1",
         "duration" => 5,
         "loglevel" => "INFO"
}

There's no easy way of doing that, and no need to: the T and the Z are part of the ISO 8601 UTC representation that @timestamp always uses. Just let it be.

How can I parse the timestamp even when there is a _grokparsefailure? Not all of my log messages are the same; below is an example:

"@timestamp" => 2018-09-21T06:44:31.018Z,
  "@version" => "1",
      "tags" => [
    [0] "_grokparsefailure"
],
      "host" => "node1",
   "message" => "2018-09-20 12:19:42 [INFO] from application in pool-3-thread-5 - Authenticate DQL processing time for transactionId : 6064824338348622_node1 documentType : License merchant : 70214f84- is 376 msec"

A single grok filter can list multiple expressions (see the description of the match option in the grok filter documentation for details). After the more specific expressions you currently have, list a generic one that only extracts the minimum, like the timestamp, the loglevel, and the message itself.
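A sketch of what that could look like (grok tries the expressions in order and stops at the first one that matches, since break_on_match defaults to true; the documenttype and logmessage names are illustrative, and documenttype is written without a space because grok field names can't contain one):

grok {
    match => {
        "message" => [
            # specific expression first
            "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:documenttype} is %{INT:duration:int}",
            # generic fallback: timestamp, loglevel, and the rest of the line
            "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{GREEDYDATA:logmessage}"
        ]
    }
}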

Understood. We can create multiple expressions in the grok filter; if the log message matches at least one expression, then @timestamp will work.

I'm getting an error in the Logstash logs:

[2018-09-21T09:18:11,572][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.09.21", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2346414e>], :response=>{"index"=>{"_index"=>"filebeat-2018.09.21", "_type"=>"doc", "_id"=>"JpNs-2UB9EmukO5GHv4N", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018-09-21 09:17:14,137\" is malformed at \" 09:17:14,137\""}}}}}

Why not just delete timestamp after parsing it into @timestamp? Then that problem will disappear.
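For example (a sketch using the common remove_field option, which is only applied when the filter succeeds):

date {
    match => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss" ]
    # drop the intermediate field once @timestamp has been set;
    # remove_field only runs if the date parse succeeded
    remove_field => [ "timestamp" ]
}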
