Create a timestamp without having a real date, from a CSV with a time index

Hello there,
I want to visualize a lot of old CSV files.
Each file contains a lot of data.

Structure:
Time;DataSet1;[...];DataSet48
0.009618018;[...];0
0.019466060;[...];1
[...]
720.946525393;[...];0.012807

The time base is seconds, beginning at the moment the file was created.

My logstash.conf looks like this:

filter {
  csv {
    separator => ";"
    columns => ["Time", "DataSet1", [...], "DataSet48"]
    convert => {
      "Time" => "float"
      "DataSet1" => "float"
      [...]
      "DataSet48" => "float"
    }
  }
  date {
    match => ["Time", "s.SSSSSSSSS"]
    target => "timestamp"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "cv"
    workers => 1
  }
  stdout { codec => rubydebug }
}

Everything is loaded without error. So far so good.

My problem:
There is a huge mismatch in the timestamps. I get 6,000 entries on Jan 1st 2017, all between 00:00 and 00:01, and I can't find any entry whose "Time" is greater than 59.999999999.

Is it possible to convert my "Time" into something that Kibana can visualize with all of my data?

Greetings S3

What do your time values mean? What is the time relative to?

The values are seconds with a lot of decimal places, starting at zero when the file was created / the machine started doing its stuff.

There's nothing built-in for this, but I suppose you could use a ruby filter to check the creation time of the input file (the path to which can be found in the path field) and add that time to the first column.
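A rough sketch of what that ruby filter could look like, purely as an illustration (it assumes the Logstash 5.x event API with event.get/event.set, that the file input puts the source path into the path field, and that File.ctime is close enough to the creation time; on most filesystems it is actually the inode change time, so treat it as an approximation):

ruby {
  code => "
    path = event.get('path')
    if path && File.exist?(path)
      # approximate creation time of the source file, as epoch seconds
      base = File.ctime(path).to_f
      # relative offset in seconds taken from the CSV's Time column
      offset = event.get('Time').to_f
      # absolute timestamp = file creation time + relative offset
      event.set('@timestamp', LogStash::Timestamp.at(base + offset))
    end
  "
}

You would place this ruby block after the csv filter, in place of the date filter, so that the Time field is already parsed when the offset is computed.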

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.