Update @timestamp from log data

My log line looks like this:

Apr 27 10:35:44.001204 xxxClient(106809) NOTICE: {970020} ABC CLIENT_MIN_LATENCY in the last period for host abc4.example.com was 4906

Can you please help me understand how to update the event's @timestamp from the log data?

I am using Logstash v5.3.1.

My Logstash config is as follows:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{MONTH} %{MONTHDAY} %{TIME} %{WORD:ProcessName}\(%{INT:ProcessId}\) %{WORD:Status}: \{%{INT:ProcessId}\} %{WORD:SubProcessName} %{WORD:LatencyDetails} in the last period for host %{GREEDYDATA:DgwHost} was %{INT:Latency}" }
  }

  date {
    #locale => "en"
    #timezone => "America/New_York"
    #match => [ "@timestamp", "MMM dd HH:mm:ss.SSS" ]
    #add_tag => [ "tsmatch" ]

    #match => [ "event_timestamp", "MMM dd HH:mm:ss.SSS", "ISO8601" ]
    match => [ "event_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    #target => "@timestamp"
    #add_tag => [ "tmatch" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Why reinvent the wheel when Logstash ships with patterns for syslog files? See the examples section in the Logstash documentation.
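
For example, the stock SYSLOGTIMESTAMP pattern already matches a leading timestamp like yours, so a minimal sketch could be (the rest field name is just a placeholder):

filter {
  grok {
    # SYSLOGTIMESTAMP matches "Apr 27 10:35:44.001204", fractional seconds included
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{GREEDYDATA:rest}" }
  }
}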

Thanks, Magnus, for your response. I tried various options, but my end goal of assigning the date from the log file to the Logstash @timestamp didn't work.

Can you please help me with the above log line?

Looking more closely, I see that your log isn't quite in syslog format, so using a custom grok filter makes sense. But you're trying to parse the event_timestamp field, which doesn't exist; in fact, you're not extracting the timestamp into any field at all. This should give you an event_timestamp field that you can parse:

^(?<event_timestamp>%{MONTH} %{MONTHDAY} %{TIME}) %{WORD:ProcessName} ...
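
Untested, but folded into the rest of your pattern and combined with a date filter it could look something like this (your timestamps carry six fractional digits, hence the six S characters; Joda-Time only keeps millisecond precision, so the last three digits are dropped):

filter {
  grok {
    # Capture the leading timestamp into event_timestamp, then the rest of the line
    match => { "message" => "^(?<event_timestamp>%{MONTH} %{MONTHDAY} %{TIME}) %{WORD:ProcessName}\(%{INT:ProcessId}\) %{WORD:Status}: \{%{INT:ProcessId}\} %{WORD:SubProcessName} %{WORD:LatencyDetails} in the last period for host %{GREEDYDATA:DgwHost} was %{INT:Latency}" }
  }

  date {
    # Parsed value is truncated to millisecond precision
    match => [ "event_timestamp", "MMM dd HH:mm:ss.SSSSSS" ]
  }
}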

I strongly suggest that, while you're debugging your filters, you use a stdout { codec => rubydebug } output instead of your current elasticsearch output.
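
That is, something like:

output {
  # Prints each event to the console with all fields visible
  stdout { codec => rubydebug }
}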

I think I got it partially working, i.e. @timestamp is only partially replaced: everything except the hour is set properly. Please let me know if I am doing something wrong.

Below is my Logstash config.

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{WORD:ProcessName}\(%{INT:ProcessId}\) %{WORD:Status}: \{%{INT:ProcessId}\} %{WORD:SubProcessName} %{WORD:LatencyDetails} in the last period for host %{GREEDYDATA:DgwHost} was %{INT:Latency}" }
  }

  mutate {
    convert => { "Latency" => "integer" }
  }

  date {
    match => [ "timestamp", "MMM dd HH:mm:ss.SSSSSS" ]
  }
}

And the resulting event is as below:

"message" : "Mar 29 11:56:15.410045 diameterBeClient(16044) NOTICE: {970020} DCD DIAMETER_MIN_LATENCY in the last period for host dgw1.example.com was 5845"
"@timestamp" : "2017-03-29T15:56:15.410Z"
"timestamp" : "Mar 29 11:56:15.410045"

What's the timezone of the machine where Logstash runs? If it's UTC-4 then things are working just fine. @timestamp is always UTC.
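
If the machine producing the logs is in a different timezone than the Logstash host, you can also state the source timezone explicitly in the date filter. A sketch, assuming US Eastern as in the commented-out line in your first config:

date {
  match => [ "timestamp", "MMM dd HH:mm:ss.SSSSSS" ]
  # Timezone the log timestamps are written in, not an output zone;
  # @timestamp is still stored in UTC
  timezone => "America/New_York"
}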

Oh, that explains it. My Logstash host is in the Eastern Daylight Time (UTC-4) timezone.

Is there any way to set @timestamp to the exact value of the log date, irrespective of the timezone my application server is running in?

As I said, @timestamp is always UTC.

Got it now. Thanks Magnus for all your help.
