Timestamp issue when storing logs


(KMG) #1

I don't know why Logstash is creating the timestamp with roughly a 1-hour lag. Initially I thought the problem might be on the Elasticsearch side.

But after some analysis, just by redirecting the Logstash output to stdout, I saw the wrong timestamp being registered by Logstash itself. I don't know where it gets that timestamp from, since I haven't configured anything to use the log's own timestamp field.
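
For the stdout test I just swapped the elasticsearch output for a stdout output, roughly like this (rubydebug codec only for inspection):

output {
  stdout {
    # print every event with all of its fields, including @timestamp
    codec => rubydebug
  }
}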

This is my Logstash configuration:

input {
  kafka {
    bootstrap_servers => ["localhost:9092"]
    topics => ["app"]
    codec => "json"
  }
}

filter {
  geoip {
    source => "ip"
  }
}

output {
  elasticsearch {
    hosts => ["10.11.12.169:9200"]
    index => "app-%{+YYYY.MM.dd}"
  }
}

Server date:

[root@srv1 conf.d]# date
Fri Jul 28 14:27:59 GMT 2017
[root@srv1 conf.d]#

Logstash stdout log:

"start_time": "2017-07-28T13:22:00.049Z",
"@timestamp": "2017-07-28T13:22:40.000Z",
"flow_id": "kAD/////AP//CP////8AAAE0kChFRUEnLLre",
"last_time": "2017-07-28T13:22:00.049Z",


(KMG) #2

My source is Packetbeat, which pushes the data into Kafka, and I'm using Logstash to fetch the logs from Kafka.

I checked the messages stored in Kafka using the consumer script; the data had a timestamp field with the correct value. But the logs stored in Elasticsearch have a different timestamp, so I suspect something is wrong with Logstash.

Yeah, that's right: the timestamp field is getting changed on the Logstash side. I don't know why it changes the timestamp field when the server where Logstash is running has the correct time.
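
As a workaround I'm thinking of forcing @timestamp from one of the event's own time fields with a date filter, something like the sketch below (using start_time from the event above just as an example; I'm not sure that's the right field). But I'd rather understand why Logstash changes it in the first place.

filter {
  date {
    # take the event's own start_time (ISO8601, e.g. 2017-07-28T13:22:00.049Z)
    # and write it into @timestamp instead of whatever is set now
    match  => ["start_time", "ISO8601"]
    target => "@timestamp"
  }
}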

Can someone please help me with this? My entire production system is affected by this time mismatch issue.


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.