Replacing @timestamp with a custom timestamp extracted from logs

Hello everyone,
I am a newbie to the ELK stack and am still learning Logstash, Kibana, and their uses. I am currently stuck at a point where I want to extract the timestamp from my logs and use it to replace @timestamp on the Kibana dashboard. I have seen various other people who faced the same issue, but none of their solutions seem to work for me.

Here is the grok filter that I am using for my logs:

filter {
  if [fields][Component] == "Campaign Director" {
    grok {
      match => {
        'message' => '(?<LogLevel>[%{WORD}]+)([-])?(?<Category>[%{WORD}]+)? (?<TimeStamp>%{MONTHDAY:Day} %{MONTH:Month} %{YEAR:Year} %{HOUR}:%{MINUTE}:%{SECOND}\.%{WORD:Milliseconds}) \[%{DATA:ThreadName}\|%{DATA:ClassName}\.%{DATA:FunctionName}(:%{NUMBER:LineNumber})?] *- %{GREEDYDATA:LogMessage}'
      }
    }
    date {
      match => ["timestamp", "dd MMM yyyy HH:mm:ss.SSS"]
      target => "@timestamp"
    }
    mutate {
      add_field => {
        Component => "%{[fields][Component]}"
      }
      gsub => ["Category", "TSK", "TASKS"]
      gsub => ["Category", "RST", "REST"]
      gsub => ["Category", "ZNE", "ZONES"]
      gsub => ["Category", "DSH", "DASHBOARD"]
      gsub => ["Category", "HST", "HISTORY"]
      gsub => ["Category", "SCD", "SCHEDULES"]
      gsub => ["Category", "IMP", "IMPORT"]
      gsub => ["Category", "CLP", "CLEANUP"]
      gsub => ["Category", "TSK", "TASK"]
      gsub => ["Category", "ENTEXT", "ENTRYEXIT"]
      gsub => ["Category", "IMPVRB", "IMPORTVERBOSE"]
      gsub => ["LogLevel", "FST", "FINEST"]
      gsub => ["LogLevel", "FNR", "FINER"]
      gsub => ["LogLevel", "FNE", "FINE"]
      gsub => ["LogLevel", "IFO", "INFO"]
      gsub => ["LogLevel", "WRN", "WARN"]
      gsub => ["LogLevel", "FTL", "FATAL"]
      gsub => ["LogLevel", "ERR", "ERROR"]
      gsub => ["Component", "CmpDir", "Campaign Director"]

    }
  }
}


This is a sample log line:
FST-SCD 11 Dec 2022 07:39:50.527 [Cleanup-Thread|CleanupThread.cleanDanglingSchedules:] - CleanDanglingSchedules - inside for loop index=1, TriggerName=Trigger1, TriggerState=NORMAL
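For reference, the grok match on this sample line can be reproduced with a rough Python regex sketch. The named groups mirror the grok field names above; the pattern is my approximation of the grok semantics, not an exact translation:

```python
import re

line = ("FST-SCD 11 Dec 2022 07:39:50.527 "
        "[Cleanup-Thread|CleanupThread.cleanDanglingSchedules:] - "
        "CleanDanglingSchedules - inside for loop index=1, "
        "TriggerName=Trigger1, TriggerState=NORMAL")

# Approximate Python equivalent of the grok pattern in the filter above
pattern = re.compile(
    r"(?P<LogLevel>\w+)(?:-(?P<Category>\w+))? "
    r"(?P<TimeStamp>\d{2} \w{3} \d{4} \d{2}:\d{2}:\d{2}\.\d{3}) "
    r"\[(?P<ThreadName>[^|]+)\|(?P<ClassName>[^.]+)\."
    r"(?P<FunctionName>.+?)(?::(?P<LineNumber>\d+))?\]"
    r" *- (?P<LogMessage>.*)"
)

m = pattern.match(line)
print(m.group("LogLevel"), m.group("Category"), m.group("TimeStamp"))
# FST SCD 11 Dec 2022 07:39:50.527
```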

Currently the @timestamp field shows the current date, not the date from the log.
Any help would be appreciated. Thanks, everyone.

The grok pattern is OK; parsing is correct.
The fields TimeStamp and timestamp are different: field names are case sensitive.

      date {
        match => ["TimeStamp", "dd MMM yyyy HH:mm:ss.SSS"]
        # timezone => "Europe/Berlin"
        # target => "@timestamp" is not needed, it's the default
      }

Result:
"@timestamp" => 2022-12-11T06:39:50.527Z

Optionally you can set your time zone; by default the date filter uses the time zone of the Logstash host. Setting it explicitly is useful in cloud environments.
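The effect of the timezone option can be sketched the same way: the log time 07:39:50.527, interpreted as Europe/Berlin (UTC+1 in December), gives exactly the @timestamp shown above. This uses the Python stdlib `zoneinfo` as an illustration, not the date filter itself:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Log time interpreted in the configured time zone
local = datetime(2022, 12, 11, 7, 39, 50, 527000,
                 tzinfo=ZoneInfo("Europe/Berlin"))

# @timestamp is stored in UTC
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2022-12-11T06:39:50.527000+00:00
```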

Thanks for the reply. I have tried that with TimeStamp, but then it does not display anything on the Kibana dashboard at all.

TimeStamp is a string.
If you want, use target => "@TimeStamp" to keep TimeStamp as a date. You would then have to recreate the index pattern, or just leave @timestamp as the time field in Kibana.

But @timestamp does not show the correct date and time if I leave it as it is.

Also, if I use target => "@TimeStamp", will it convert the field to a date?

Perhaps that is because the time range on the dashboard does not extend far back enough to show early December. That may indicate it is actually working.


As Badger said, post a screenshot from Kibana showing where the data is not OK and what the message looks like.


OHHHH... now I get it, so dumb of me not to check that beforehand. Thanks @Rios and @Badger for helping, I got what I wanted.