Logstash KV plugin, convert string values to timestamp

(shashwat) #1


I am using Logstash 5.5 to parse my logs. My logs have the following format:

time taken for doing transfer once for all files in seconds=23 transfer start time= 201708030959 transfer end time = 201708030959.

I am using the KV plugin to get key/value pairs from this file. I want to convert the received "time" key's value "201708030959" to an actual timestamp. I am using the following configuration:
filter {
  kv {
    allow_duplicate_values => false
    trim_key => "\s"
    value_split => "="
  }
  date {
    match => ["time", "YYYYMMddHHmm"]
  }
}

But it does not convert "time" to a date/timestamp when I check in Kibana; it stays a string. Please let me know how I can convert this string time value to an actual timestamp.


(Magnus Bäck) #2

Your date filter doesn't convert the time field. It parses that field and stores the resulting timestamp in the @timestamp field. Secondly, even if you configure the date filter to store the result in the time field, that field will already have been mapped as a string, so you need to reindex or create a new index for it to be mapped as a timestamp.

(shashwat) #3

Hi Magnus,

Thanks for replying. If I am getting it correctly, should I refer to the @timestamp field in my date filter, like this:

date {
  match => ["@timestamp", "YYYYMMddHHmm"]
}

OR something like this:

date {
  match => ["time", "YYYYMMddHHmm"]
  target => "@timestamp"
  remove_field => ["time"]
}


(Magnus Bäck) #4

The latter.
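
For reference, the complete filter combining the KV snippet from the first post with that date filter would look something like this (a sketch assembled from the snippets above; the time field name is assumed to be what the kv filter actually produces from the log line):

```
filter {
  kv {
    allow_duplicate_values => false
    trim_key => "\s"
    value_split => "="
  }
  date {
    # Parse the extracted "time" value, e.g. 201708030959
    match => ["time", "YYYYMMddHHmm"]
    # Store the result in @timestamp and drop the original string field
    target => "@timestamp"
    remove_field => ["time"]
  }
}
```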

(shashwat) #5

Hi Magnus,

Thanks, that worked. I can see @timestamp mapped to the "time" field in Kibana. The only difference is between the time written in the logs and the time displayed in the Kibana GUI, e.g.:

time taken for doing transfer once for all files in seconds=30 transfer start time= 201708031000 transfer end time = 201708031000

But in Kibana, it is shown as:

@timestamp August 3rd 2017, 20:30:00.000

So the date is correct but the time is not the same. Is it due to the local timezone my browser is running in? Can I make any configuration change to keep this time in sync?


(Magnus Bäck) #6

The @timestamp field is always UTC, but Kibana by default adapts it to the browser's local timezone. What timezone is 201708031000 in? What's the raw and unadjusted @timestamp value (use Kibana's JSON tab)?

(shashwat) #7

Hi Magnus,

It is CDT (Central Daylight Time), which is 10:30 hours behind my current location (New Delhi). I think it is the same difference I see in Kibana.

Following is Kibana's JSON tab:

"_index": "logs_output",
"_type": "logs_output",
"_id": "AV5hVygPhhnM7015uVjd",
"_version": 1,
"_score": null,
"_source": {
  "path": "/var/opt/logs/ELK/applog_output.log",
  "seconds": "23",
  "@timestamp": "2017-08-03T14:59:00.000Z",
  "@version": "1",
  "host": "serv231-mgmt",
  "message": "time taken for doing transfer once for all files in seconds=23 transfer start time= 201708030959 transfer end time = 201708030959",
"fields": {
  "@timestamp": [
"sort": [


(Magnus Bäck) #8

If 201708031000 is CDT and gets turned into 2017-08-03T15:00:00.000Z in @timestamp which is then displayed as Aug 3, 2017 at 20:30 local time for you then everything is working as expected.
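
If the goal is to tell Logstash explicitly that the log timestamps are in CDT rather than relying on the machine's default timezone, the date filter has a timezone option. A sketch, assuming US Central time is where the logs are produced (adjust the zone name as needed):

```
date {
  match => ["time", "YYYYMMddHHmm"]
  # Interpret the parsed value as US Central time;
  # the named zone handles the CST/CDT switch automatically
  timezone => "America/Chicago"
  target => "@timestamp"
}
```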

(shashwat) #9

Thanks, Magnus, for helping me out.

(shashwat) #10

Hi Magnus,

Is there a way in Logstash to have more than one @timestamp attribute mapped to the corresponding "time" keys in the logs? E.g. I have two different "time" keys in a log line, time="value1" and time="value2". For the same value (value1 = value2) it works fine, but for different time values the above configuration does not work correctly.


(Magnus Bäck) #11

You can have any number of timestamp fields. The date filter has a target option that allows you to choose where the parsed timestamp is stored.
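
For example, assuming the kv filter produces two separate fields named start_time and end_time (hypothetical names; use whatever field names your kv output actually contains), two date filters with different target settings keep both timestamps:

```
date {
  match => ["start_time", "YYYYMMddHHmm"]
  # Store the parsed value in its own field instead of @timestamp
  target => "transfer_start"
}
date {
  match => ["end_time", "YYYYMMddHHmm"]
  target => "transfer_end"
}
```

Note that, as mentioned earlier in the thread, any new field must be mapped as a date in Elasticsearch; if it has already been indexed as a string you need to reindex or start a new index.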

(shashwat) #12

Thanks Magnus. I am able to solve it now. Thanks for the guidance.

(system) #13

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.