Discrepancy between Logstash timestamp output and Kibana

Hello,

I have a CSV with lines like the following:

"Broker Name","Flow Name","Record Start Date","Record Start Time","Record GMT Start Timestamp","Record End Date","Record End Time","Record GMT End Timestamp","Average CPU Time","Total Number of Input Messages"
"DEV1","SAPServiceFlow","2017-02-27","00:55:32.844021","2017-02-27 06:55:32.8440","2017-02-27","01:46:8.425477","2017-02-27 07:46:8.42547","137110","3035303751"

I am using my Logstash conf file to parse it, but I am having issues with the timestamp: the Logstash output and what Kibana displays don't match.

Here's the date-handling part of my Logstash conf:

mutate {
  add_field => {
    "timestamp" => "%{Record End Date} %{Record End Time}"
  }
  remove_field => ["Record End Date"]
  remove_field => ["Record End Time"]
}
date {
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSSSSS", "ISO8601" ]
  timezone => "Etc/UTC"
  remove_field => [ "timestamp" ]
}

Now, when I run the Logstash conf file, I get the timestamp in the output in Central Standard Time, which is what Record End Date and Record End Time indicate.

Here's the part of my output showing the timestamp:

"@timestamp" => 2017-02-27T01:46:8.425Z

However, when I feed this same output to Kibana, the timestamp is off; basically it's picking up the GMT time. This is messing up my filter searches in Kibana.

What's wrong with my conf file, such that I get the desired output from Logstash but the same output is messed up in Kibana?

If I understand you correctly, then the timestamp indicated by

"@timestamp" => 2017-02-27T01:46:8.425Z

is in Central Standard Time.

Is that correct?

Logstash uses Zulu time notation for ISO8601, which means that the Z at the end indicates the time of the record is UTC. If timestamp is not in UTC, then you should change timezone => "Etc/UTC" to reflect the timezone of the actual record. Logstash will make the Zulu notation reflect the actual time of the event.
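
For example, a minimal sketch, assuming the records were captured in US Central time (the field name and pattern are taken from your snippet above):

  date {
    # Zone the source data was recorded in; Logstash parses the local
    # time and converts it to UTC (Zulu) when setting @timestamp.
    match    => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSSSSS" ]
    timezone => "America/Chicago"
  }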

Yes, Aaron. The timestamp in my output from Logstash is in Central Standard Time. Can you tell me how I can set this up in Logstash so that the timestamp shows in Central Standard Time in both the Logstash output and Kibana, please?
I tried timezone => "America/Chicago" and some other options, but it looks like they don't work.

Kibana auto-adjusts from UTC (which is how the date is stored in Elasticsearch) to your local timezone. The correct way is as you stated, with the timezone parameter, and your example, timezone => "America/Chicago", should work.

I tested 2017-02-27 00:55:32.844021 with:

input { stdin {} }

filter {
  date {
    match => [ "message", "yyyy-MM-dd HH:mm:ss.SSSSSS" ]
  }
}

output { stdout { codec => rubydebug } }

And got output:

2017-02-27 00:55:32.844021
{
    "@timestamp" => 2017-02-27T07:55:32.844Z,
      "@version" => "1",
          "host" => "Steiny-2.local",
       "message" => "2017-02-27 00:55:32.844021"
}

But I'm in Mountain time. So I added timezone => "America/Chicago":

2017-02-27 00:55:32.844021
{
    "@timestamp" => 2017-02-27T06:55:32.844Z,
      "@version" => "1",
          "host" => "Steiny-2.local",
       "message" => "2017-02-27 00:55:32.844021"
}

And you can see that the @timestamp changed from 2017-02-27T07:55:32.844Z to 2017-02-27T06:55:32.844Z.

Thanks for the help, Aaron. I swear I am doing the same thing in my conf file. I don't know for what reason, but it simply doesn't reflect Central time. I checked and rechecked. This should be very simple. Not really sure what I am doing wrong.
I am adding End Date and End Time. It's doing that part without any error. In the date filter, I am converting it to UTC format. That should do it, right?

Do not use timezone => "Etc/UTC" in the date filter. The date filter's purpose is to parse your timestamp and store it in @timestamp as ISO8601, which it renders in Zulu time (UTC with a Z on the end). You don't have to specify the timezone as UTC. Just set it to what it is ("America/Chicago"), since there is no zone info in the timestamp field as received.

This is my complete conf file.

input {
  file {
    path => "/tmp/stats/flowStats.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Broker Name","Flow Name","Flow UUID","Record Start Date","Record Start Time","Record GMT Start Timestamp","Record End Date","Record End Time","Record GMT End Timestamp","Average CPU Time","Total Number of Input Messages"]
  }
  mutate {
    rename => { "Broker Name" => "broker_name" }
    rename => { "Flow Name" => "flowname" }
    rename => { "Average CPU Time" => "cputime" }
    rename => { "Total Number of Input Messages" => "input_messages" }
    remove_field => ["message","Flow UUID","Record Start Date","Record Start Time","Record GMT Start Timestamp","Record GMT End Timestamp"]
    add_field => {
      "timestamp" => "%{Record End Date} %{Record End Time}"
    }
    remove_field => ["Record End Date"]
    remove_field => ["Record End Time"]
  }
  date {
    match => ["timestamp","yyyy-MM-dd HH:mm:ss.SSSSSS"]
    timezone => "America/Chicago"
  }
  mutate {
    remove_field => [ "timestamp" ]
  }
}

output {
  stdout { codec => rubydebug }
  file {
    path => "/tmp/flwstats.csv"
  }
}

I really don't know what I am doing wrong.

Do you have some sample output from rubydebug?

Yes. And just FYI, this is a sample CSV:

"DEV1","ServiceNow","UID1","2017-02-27","13:46:15.653605","2017-02-27 19:46:15.6536","2017-02-27","14:46:18.429379","2017-02-27 20:46:18.4293","6023570","3011785"
"DEV1","ServiceNow","UID1","2017-02-27","14:46:18.429438","2017-02-27 20:46:18.4294","2017-02-27","15:46:15.259306","2017-02-27 21:46:15.2593","6254289","3127144"

Here's the sample output from rubydebug:

{
    "input_messages" => "3011785",
              "path" => "/tmp/stats/flowStats.csv",
          "@version" => "1",
              "host" => "lsdebr1",
       "broker_name" => "DEV1",
        "@timestamp" => 2017-02-27T20:46:18.429Z,
          "flowname" => "ServiceNow",
           "cputime" => "6023570"
}
{
    "input_messages" => "3127144",
              "path" => "/tmp/stats/flowStats.csv",
          "@version" => "1",
              "host" => "lsdebr1",
       "broker_name" => "DEV1",
        "@timestamp" => 2017-02-27T21:46:15.259Z,
          "flowname" => "ServiceNow",
           "cputime" => "6254289"
}

Based on the values you provided:

If 14:46 is the accurate time in the America/Chicago timezone, then the resulting @timestamp is accurate. 14:46 is 6 hours behind UTC, so the UTC timestamp is 20:46.

The second line is also accurate.

So this means that your date filter is correctly converting the timestamp. What's the problem on the Kibana end, then? As mentioned, Elasticsearch stores all time values as epoch time, which is by definition UTC. Kibana converts the times on the fly to reflect your local time zone. When expanding the log lines you will see UTC values for @timestamp, but in the graphs and charts you will see the compensated timestamps.
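
If you want to verify what Elasticsearch actually stored, you can query it directly. A quick sketch, assuming the events are indexed on localhost under the default logstash-* index pattern (your posted config only outputs to stdout and a file, so adjust for however you ship events to Elasticsearch):

  # Hypothetical check; the host and index pattern are assumptions.
  curl -s 'http://localhost:9200/logstash-*/_search?size=1&pretty'

The @timestamp in the returned _source will be the UTC value (e.g. 2017-02-27T20:46:18.429Z); the shift to Central time happens in Kibana at display time.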

Like I said when I started this topic, I was using Etc/UTC for the timezone to get the timestamp in Central time in the Logstash output, and when I fed that to Kibana, it was off by 6 hrs, showing the GMT time instead.
Maybe Etc/UTC messed up the timing in Kibana too.
OK, let me try feeding this to Kibana and see how it shows. Thanks for all the help, Aaron. You are a real superhero.

As a fun optimization:

You can put the remove_field inside the date filter block. It will only remove the field if the match and conversion are successful. That way you don't have to worry about the field being removed in spite of a mismatch, whereas in the current config it will remove the field either way.

Try this instead:

  date{
    match => ["timestamp","yyyy-MM-dd HH:mm:ss.SSSSSS"]
    timezone => "America/Chicago"
    remove_field => [ "timestamp" ]
  }
