Time Field issue

Hello Team,

I have written a pattern for my log and it is parsing properly. Below is a sample log line:

I, [2018-11-26T06:30:02.281210 #17562]  INFO -- : [0b94e730-7bf1-435d-9f4b-8dd17a0769c1] Current device: 176642 : Zonnebloem 2 LN

And pattern for same is:

\w\,\s\[(?<date-time>[\w\-\:\.]+)\s\#(?<pid>[\d]+)\]\s\s(?<loglevel>[\w]+)\s\--\s\:\s\[(?<request-id>[\d\-\w]+)\]\s(?:[cC]urrent\s)?[dD]evice[\s:]+(?<device-id>[\w\s\:]+)

When I check the Kibana dashboard, all the fields show up correctly; only the date-time field does not show the correct value.

It shows the value in the IST time zone when I receive this log on the Kibana dashboard. Please refer to the screenshot below:

Can you please help me understand the reason for this strange behavior?

Thank you.

All timestamps in Elasticsearch have to be in the UTC timezone. When Kibana shows timestamps it converts them to the local timezone, which explains the offset. It does, however, not change or alter the actual source documents, which is why you still see the UTC timestamp there.
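For example (illustrative values only): a document stored as 2018-11-26T06:30:02.281Z would be rendered as 2018-11-26 12:00:02.281 in a browser set to IST (UTC+05:30), while the stored document itself still holds the UTC value.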

@Christian_Dahlqvist, Thank you for your response.

Is there any way to see the actual date and time that we receive in the logs, in the field I created (date-time)?

Thanks.

In Kibana, go to Management > Advanced Settings > Timezone for date formatting and see if changing it from Browser to your timezone helps?

@Eniqmatic, Thanks for your response.

That will change the @timestamp field value, which is in my local time zone, so I don't want to change it. Please read my first post. I have a log and I have written a regex pattern for it. The log contains a date and time, so I created a field called date-time, and when testing the pattern this field is parsed properly. But on the Kibana dashboard this field is also showing the UTC value, not the actual value present in my log.

Thanks.

Ah sorry, are you using the date filter?

Below is my log:

I, [2018-11-26T06:30:02.281210 #17562]  INFO -- : [0b94e730-7bf1-435d-9f4b-8dd17a0769c1] Current device: 176642 : Zonnebloem 2 LN

My regex pattern for above log:

\w\,\s\[(?<date-time>[\w\-\:\.]+)\s\#(?<pid>[\d]+)\]\s\s(?<loglevel>[\w]+)\s\--\s\:\s\[(?<request-id>[\d\-\w]+)\]\s(?:[cC]urrent\s)?[dD]evice[\s:]+(?<device-id>[\w\s\:]+)

Thanks.

That's not what I meant, are you using the date filter to create the field? Post your full "filter" config please.

Below is my filter:

filter {
if [type] == "application_log" {
grok {
match => { "message" => [ "\w\,\s\[(?<date-time>[\d\-\w\:\.]+)\s\#(?<pid>\d+)\]\s+(?<loglevel>\w+)\s\-+\s\:\s\[(?<request-id>[\d\w\-]+)\]\s(?<method>[\w\s]+)\s\"(?<path>[\w\/\.]+)\"\s(?<mlp-message>.*)", "\w\,\s\[(?<date-time>[\d\-\w\:\.]+)\s\#(?<pid>[\d]+)\]\s\s(?<loglevel>[\w]+)\s\--\s\:\s\[(?<request-id>[\d\-\w]+)\]\s(?:[cC]urrent\s)?[dD]evice[\s:]+(?<device-id>[\w\s\:]+)", "\w\,\s\[(?<date-time>[\d\-\w\:\.]+)\s\#(?<pid>\d+)\]\s+(?<loglevel>\w+)\s\-+\s\:\s\[(?<request-id>[\d\w\-]+)\]\s(?<mlp-message>.*)", "\w\,\s\[(?<date-time>[\w\-\:\.]+)\s\#(?<pid>[\d]+)\]\s\s(?<loglevel>[\w]+)\s\-+\s\:\s\[?[sS]idekiq::Extensions::DelayedClass\s(?<request-d>[\w]+)\]\s(?<mlp-message>.*)", "\w\,\s\[(?<date-time>[\w\-\:\.]+)\s\#(?<pid>\d+)\]\s+(?<loglevel>\w+)\s(?<mlp-message>.*)" ] }
}
}
else if [type] == "syslog_logs" {
grok {
match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }
      pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }
}
date {
      match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
   }
}
else if [type] == "auth_logs" { 
grok {
match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:\[%{POSINT:[system][auth][pid]}\])?: \s*%{DATA:[system][auth][user]} :( %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:\[%{POSINT:[system][auth][pid]}\])?: new group: name=%{DATA:system.auth.groupadd.name}, GID=%{NUMBER:system.auth.groupadd.gid}", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:\[%{POSINT:[system][auth][pid]}\])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: (?<system.auth.ssh.event>Accepted) (?<system.auth.ssh.method>publickey) \w+ (?<username>.*)", "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"] }
pattern_definitions => {
        "GREEDYMULTILINE"=> "(.|\n)*"
      }
}
   geoip {
      source => "[system][auth][ssh][ip]"
      target => "[system][auth][ssh][geoip]"
   }

}
else if [type] == "nginx_access" {
grok {
match => { "message" => ["%{IPORHOST:[nginx][access][remote_ip]} - %{DATA:[nginx][access][user_name]} \[%{HTTPDATE:[nginx][access][time]}\] \"%{WORD:[nginx][access][method]} %{DATA:[nginx][access][url]} HTTP/%{NUMBER:[nginx][access][http_version]}\" %{NUMBER:[nginx][access][response_code]} %{NUMBER:[nginx][access][body_sent][bytes]} \"%{DATA:[nginx][access][referrer]}\" \"%{DATA:[nginx][access][agent]}\" %{NUMBER:[nginx][response][time]}"] }
}
mutate {
      add_field => { "read_timestamp" => "%{@timestamp}" }
   }
   useragent {
      source => "[nginx][access][agent]"
      target => "[nginx][access][user_agent]"
      remove_field => "[nginx][access][agent]"
   }
   geoip {
      source => "[nginx][access][remote_ip]"
      target => "[nginx][access][geoip]"
   }
}
else {
grok {
match => { "message" => [ "(?<date-time>[\w\s\d\:]+)\s(?<IP>192.168.50.1)\s(?<port>582)\:\s(?<message>.*)" ] }
}
}
}

Please help me to fix the issue.

Thanks.

The example you showed is an application_log, for which it seems you do not apply any date filter. The field you are extracting seems to be recognised as a timestamp by Elasticsearch, and it is assumed to be in UTC, as that is the requirement.
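For illustration only, a date filter for that branch might look something like the sketch below. This assumes the extracted date-time value is in ISO8601 form (as in your sample log) and was written in UTC; adjust the timezone if your application servers log in a different zone.

date {
      # "date-time" is the field extracted by the grok pattern above
      match => [ "date-time", "ISO8601" ]
      # the log timestamp carries no zone information, so tell Logstash
      # which zone it was written in (assumed UTC here)
      timezone => "Etc/UTC"
      # write the parsed value back into "date-time" instead of @timestamp
      target => "date-time"
   }

Keep in mind that once the field is a proper date, Elasticsearch still stores it internally in UTC, so Kibana may continue to render it in the browser timezone.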

I agree with @Christian_Dahlqvist

You could use the date filter to set the date-time field and set the timezone as well. I'm unsure whether this will make Kibana show it correctly, however.
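One thing to note: the date filter's timezone option tells Logstash which timezone the source timestamp was written in (useful when the string itself carries no offset); it does not control how Kibana displays the value.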

@Christian_Dahlqvist,

Does that mean I need to add a date filter to my application_log filter? It would look like the one below:

date {
      match => [ "(date-time)", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
   }

Is my above date filter right? If not, please help me write it.

Thanks.

Yes, but set the target to "date-time" and use the timezone setting too.

I have set the date filter as below:

date {
      match => [ "date-time", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
   }

But I am still getting the time in UTC, not the one in the log.

Can you please help me with how to set the timezone in the date filter? I have multiple application servers which are in different timezones. Can you please give me a complete date filter example?

Thanks.

Try this:

date {
      match => [ "date-time", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      timezone => "Europe/London"
      target => "date-time"
   }

Obviously set the timezone to your timezone.

I have set the below date filter:

date {
      match => [ "date-time", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      timezone => "Etc/UTC"
      target => "date-time"
   }

My server is in the Etc/UTC timezone. I am still getting the time according to the UTC timezone in my date-time field, not the time from my log.

So you have set the timezone to UTC and you're getting the time in UTC? What would you expect?

I want the value 2018-11-26T06:30:02.281210 from the log below in my date-time field. Without using the date filter I was also getting UTC time in my date-time field, but the time in the log and the time in date-time do not match each other.

I, [2018-11-26T06:30:02.281210 #17562]  INFO -- : [0b94e730-7bf1-435d-9f4b-8dd17a0769c1] Current device: 176642 : Zonnebloem 2 LN

Yes, but in the date filter above you have set the timezone to be UTC?

Yes, I have set the timezone to UTC, but the time in the log and the time I get in date-time do not match each other. It seems that in the log I am getting the time according to the system date and time, i.e. UTC.

But in the date-time field it is different. Please refer to the screenshot below:

Thanks.