Winlogbeat does not have a proper timestamp field

Winlogbeat does not have a proper timestamp field.
Please help me with this issue.

Thanks
Pramod

All Winlogbeat events have a @timestamp field that contains the time that the Windows event log record was originally created.
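
For reference, an event shipped by Winlogbeat 1.x and indexed through Logstash looks roughly like the following (field values are illustrative). The `@timestamp` is the time the event log record was created on the Windows host, not the time it was ingested:

```json
{
  "@timestamp": "2016-06-20T10:15:32.123Z",
  "type": "wineventlog",
  "log_name": "Application",
  "source_name": "MyApp",
  "event_id": 1000,
  "level": "Error",
  "computer_name": "WIN-HOST01",
  "message": "HRESULT=0x80004005; Trial=1; Details=Unspecified error."
}
```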

Yes, it has the @timestamp field. But this field holds the time when the event was parsed by Logstash rather than the time when it was logged (created) as a Windows event.

Can you please help me resolve this issue?

Thanks
Pramod

Please share the Logstash configuration and version you are using.

Hi andrewroh,

Winlogbeat version: 1.2.3, Logstash version: 2.3.3, Elasticsearch: 2.3.3, and Kibana: 4.5.1.

# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
    beats {
        port => 5044
    }
}

# The filter part of this file is commented out to indicate that it is
# optional.
filter {
  mutate {
    gsub => [
      "message", "\r\n", " ",
      "message", "\n", " ",
      "message", "\t", " "
    ]
  }


  if [message] =~ /PFError/ {
    grok {
      match => [ "message", "%{GREEDYDATA} <ErrorCode>%{DATA:HRESULT}</ErrorCode> <Trial>%{DATA:Trial}</Trial> <Details>%{DATA:Detail}</Details> %{GREEDYDATA}" ]
    }
  }

  if [message] =~ /HRESULT/ {
    grok {
      match => [ "message", "%{GREEDYDATA} HRESULT=%{DATA:HRESULT}; Trial=%{DATA:Trial}; Details=%{DATA:Detail}[.;] %{GREEDYDATA}" ]
    }
  }

  if [message] =~ /WINNT/ {
    grok {
      match => [ "message", "%{GREEDYDATA} Exception code: %{DATA:HRESULT} %{GREEDYDATA}" ]
    }
    mutate {
      add_field => {
        "Detail" => "Unhandled access exception"
        "Trial"  => " "
      }
    }
  }

  if [message] =~ /Error|Critical/ {
    mutate {
      add_tag => [ "Error_tag" ]
    }
  }

  if "MiddleTier" in [tags] {
    mutate {
      add_field => {
        "gbu_name"   => "%{[tags][0]}"
        "prod_name"  => "%{[tags][1]}"
        "version"    => "%{[tags][2]}"
        "env_type"   => "%{[tags][3]}"
        "tier"       => "%{[tags][4]}"
        "tenantName" => "%{[tags][5]}"
      }
      lowercase => [ "gbu_name", "prod_name", "env_type", "tier", "tenantName" ]
      join => {
        "gbu_name"   => ""
        "prod_name"  => ""
        "version"    => ""
        "env_type"   => ""
        "tier"       => ""
        "tenantName" => ""
      }
    }
  }
}

output {
  stdout {
    codec => dots
  }

  if "_grokparsefailure" not in [tags] {
    if "MiddleTier" in [tags] {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "wl-logdetails"
      }
      if "Error_tag" in [tags] {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "wl-logsummary"
        }
      }
    }
  }
}

Thanks
Pramod

I don't see anything obvious in your Logstash configuration that would be modifying the @timestamp field coming from Beats. I suggest that you do a test without any filters and see what the @timestamp is. It should definitely be the time that the event log record was originally created.

input {
  beats {
    port  => 5044
  }
}

output {
  stdout { codec => rubydebug { metadata => true } } 
}

Then after doing this test, start adding back in filters one-by-one to see where the @timestamp mutation is occurring.
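
If one of the filters does turn out to be overwriting @timestamp, a generic workaround is to save a copy of the Beats-provided timestamp before any other filters run and restore it afterward with a `date` filter. This is only a sketch; `original_timestamp` is an illustrative field name, not a standard one:

```conf
filter {
  # Keep a copy of the @timestamp that arrived from Beats
  # before any other filter can modify it.
  mutate {
    add_field => { "original_timestamp" => "%{@timestamp}" }
  }

  # ... existing grok/mutate filters go here ...

  # Restore @timestamp from the saved copy (it is serialized in ISO8601
  # form by the sprintf above) and drop the temporary field.
  date {
    match        => [ "original_timestamp", "ISO8601" ]
    remove_field => [ "original_timestamp" ]
  }
}
```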

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.