Hello there,
I want to visualize a lot of old CSV files. Each file contains a lot of data.
Structure:
Time;DataSet1;[...];DataSet48
0.009618018;[...];0
0.019466060;[...];1
[...]
720.946525393;[...];0.012807
The time base is seconds, starting at the moment the file was created.
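So, for example, a row with Time = 720.946525393 in a file created at (say) 2017-01-01 12:00:00 should end up at 2017-01-01 12:12:00.946.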
My logstash.conf looks like this:
filter {
  csv {
    separator => ";"
    columns => ["Time", "DataSet1", [...], "DataSet48"]
    convert => {
      "Time" => "float"
      "DataSet1" => "float"
      [...]
      "DataSet48" => "float"
    }
  }
  date {
    match => ["Time", "s.SSSSSSSSS"]
    target => "timestamp"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost"]
    index => "cv"
    workers => 1
  }
  stdout { codec => rubydebug }
}
Everything loads without errors. So far, so good.
My problem:
There is a huge mismatch in the timestamps. I get 6,000 entries on Jan 1st 2017, all between 00:00 and 00:01, and I can't find a single entry with a "Time" greater than 59.999999999. I suspect the "s.SSSSSSSSS" pattern only fills in the second-of-minute, so all the other date fields fall back to their defaults.
Is it possible to convert my "Time" into something that Kibana can visualize, covering all of my data?
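What I have in mind is something like this (a rough, untested sketch; the base epoch 1483272000 for 2017-01-01 12:00:00 UTC and the field name "absolute_time" are made up for illustration, since the real start time would have to come from each file):

filter {
  ruby {
    # add the file's creation time (hard-coded epoch here) to the relative Time
    code => "
      base = 1483272000.0  # 2017-01-01 12:00:00 UTC, made up for this example
      event.set('absolute_time', base + event.get('Time'))
    "
  }
  date {
    match => ["absolute_time", "UNIX"]
    target => "@timestamp"
  }
}

Would something along these lines work, or is there a cleaner way to do it?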
Greetings S3