Mapping timestamp in Kibana

Hi, I have a problem mapping a timestamp field. My CSV file has a field called timestamp with values like 1545003901, 1543347920, etc. In Kibana, in Dev Tools, I type
"properties":{
"timestamp":{
"type" : "date"
"format": "epoch_millis"
}
}
But it is giving me the year 1970. I know that epoch timestamps have something to do with the year 1970, but is there any way to format this type of timestamp? How can I get the correct date out of the timestamp? Should I convert it somehow when I ingest it with Logstash?

You may want to use epoch_second instead of epoch_millis, as your data looks like it lacks millisecond granularity.
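For context: 1545003901 read as milliseconds is only about 18 days after 1970-01-01 (hence the year 1970), while read as seconds it is 2018-12-16 UTC. A minimal sketch of the mapping with epoch_second, where myindex is a placeholder for your own index name:

PUT myindex/_mapping
{
  // "myindex" above is a placeholder - substitute your actual index name
  "properties": {
    "timestamp": {
      "type": "date",
      "format": "epoch_second"
    }
  }
}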

I also tried that, but it says it failed to parse the date field with format [epoch_second].

This is the Logstash conf file,

and this is the mapping that I do in Kibana.

It says: could not parse field time_stamp with this format [epoch_second].

And this is what my time_stamp field looks like.

Please do not post screenshots, but real code snippets.

Can you create a fully reproducible minimal example, so others can follow this issue and retry it on their own? Otherwise it will be hard to help without further information like error messages.

Sure,

Conf file:

input {
  stdin {}
}

filter {
  csv {
    separator => ","
    columns => ['temp', 'location', 'clouds', 'pressure', 'rain', 'time_stamp', 'humidity', 'wind']
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weathertest"
  }
  stdout { codec => rubydebug }
}

Input data:
42.42, Back Bay, 1, 1012.14, 0.1228, 1545003901, 0.77, 11.25

Mapping dev tools kibana:

PUT weathertest/_mappings
{
  "properties": {
    "temp": {
      "type": "float"
    },
    "clouds": {
      "type": "float"
    },
    "pressure": {
      "type": "float"
    },
    "rain": {
      "type": "float"
    },
    "time_stamp": {
      "type": "date",
      "format": "epoch_second"
    },
    "humidity": {
      "type": "float"
    },
    "wind": {
      "type": "float"
    }
  }
}

Here is a minimal example.
Just to recap: the problem is the time_stamp field, which I want to convert to date. When I try epoch_millis, it indexes with the year 1970; when I use epoch_second, it can't parse the field.

Please use proper formatting when copying text snippets; this is really hard to read. This forum supports markdown, so formatting code snippets is quite easy.

I think the problem is with your data. When using the csv filter, your data gets split and the value in question becomes " 1545003901" - with a space at the beginning. This does not match your mapping configuration for the date field and thus throws an error. I am pretty sure this shows up in your Logstash output.

The solution to this is to get rid of the space at the beginning. You can try splitting by "," with a space at the end and see if that works, or you will have to trim your data; see the sketch below.
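For example (a sketch, untested against your exact setup), you could add a mutate filter with the strip option after the csv filter; strip removes leading and trailing whitespace from the listed fields:

filter {
  csv {
    separator => ","
    columns => ['temp', 'location', 'clouds', 'pressure', 'rain', 'time_stamp', 'humidity', 'wind']
  }
  mutate {
    # strip trims leading/trailing whitespace; listing every column after
    # the first is an assumption - at minimum time_stamp needs it
    strip => ["location", "clouds", "pressure", "rain", "time_stamp", "humidity", "wind"]
  }
}

Alternatively, setting separator => ", " in the csv filter may work, but only if every field in your input is consistently separated by a comma followed by a single space.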

Worked, thanks :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.