UTC vs EST log events in Kibana

Hi there. I was not able to find a solution for my problem, which is why I am creating this post. We send logs to Elasticsearch from Linux nodes that use either UTC or EST as their timezone. Logs from the EST nodes are displayed in Kibana at the correct time, but for the nodes using UTC I have to set my time range five hours into the FUTURE in Kibana in order to see the latest logs. My Logstash filter is the following, but I am not even sure Logstash is the thing that needs to be fixed for this.

root@centralizedlogging:/etc/logstash/conf.d# cat 10-syslog-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
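For context on the five-hour symptom: a syslog timestamp carries no timezone, so whichever side parses it has to assume one, and the same wall-clock reading interpreted as UTC versus US/Eastern standard time (UTC-5) differs by exactly five hours. A small Python sketch (the date chosen is arbitrary):

```python
from datetime import datetime, timezone, timedelta

# A syslog timestamp such as "Jan  5 10:00:00" carries no timezone info,
# so the parser has to assume one. The date below is arbitrary.
wall_clock = datetime(2023, 1, 5, 10, 0, 0)

# The reading interpreted as UTC:
as_utc = wall_clock.replace(tzinfo=timezone.utc)

# The same reading interpreted as US/Eastern standard time (UTC-5):
as_eastern = wall_clock.replace(tzinfo=timezone(timedelta(hours=-5)))

# Picking the wrong assumption shifts the stored instant by five hours,
# which matches the offset seen in Kibana's time picker.
print(as_eastern - as_utc)  # 5:00:00
```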

You would have to change that to

if (some condition) {
    date {
        match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
} else {
    date {
        match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        timezone => "US/Eastern"
    }
}

You will need to write a conditional that tests some field or fields on the event to see if they are UTC and sends them through the first branch.
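A minimal sketch of the hostname-based version of that conditional (the hostnames here are placeholders, not from the original thread; the UTC branch sets the timezone explicitly rather than relying on the Logstash host's local time):

# Sketch only: "utc-node-1" and "utc-node-2" are placeholder hostnames.
if [syslog_hostname] in ["utc-node-1", "utc-node-2"] {
  date {
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    timezone => "UTC"
  }
} else {
  date {
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    timezone => "US/Eastern"
  }
}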

Thank you for the quick response. I like this approach, but what CONDITION should I be checking, given that my entire filter currently looks like the one below? Should my CONDITION check whether the logs are coming from UTC, then ..... else timezone => "US/Eastern"? Thank you in advance.

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

Hard to say. You say some hosts are UTC and some are US/Eastern. You need some test to tell you what timezone a given event is in. It would be possible to test the hostname that sent it, but depending on the rest of your configuration there may be other ways. For example, if the events are being shipped by filebeat you may be able to have filebeat tag them at source. Again, if you are using beats you might be able to send all of the UTC events to one port and all of the US/Eastern events to another, then tag them with the input. There are any number of possibilities.
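For instance, a sketch of the tag-at-source approach using Filebeat's standard tags option (the input path and the "utc" tag name are illustrative, not from the original thread):

# filebeat.yml on the UTC hosts -- sketch; the "utc" tag name is arbitrary
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog
    tags: ["utc"]

The Logstash filter could then branch on something like: if "utc" in [tags] { ... use the UTC date block ... } else { ... use the US/Eastern date block ... }.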

Thank you very much. I do use Filebeat to send logs from our Linux nodes. Now that I know the source of my issue I can start fixing it. Thank you again @Badger.

Best, Yuri.

Thank you for your help @Badger. I was able to add a field in the Filebeat configuration and use that field in the condition. It is working like a charm now.

#Filebeat config
fields:
  timezone: EST
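One detail worth noting: by default Filebeat nests custom fields under a top-level "fields" key, which is why the Logstash conditional tests [fields][timezone]. A sketch of the alternative layout, if fields_under_root were enabled:

# filebeat.yml -- sketch of the alternative layout
fields:
  timezone: EST
fields_under_root: true   # field would arrive as [timezone] instead of [fields][timezone]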

#Logstash config
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    if [fields][timezone] == "EST" {
      date {
        match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        timezone => "US/Eastern"
      }
    } else {
      date {
        match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        timezone => "UTC"
      }
    }
  }
}