Kibana won't recognize my date and time field as a date and time

Hello,

I have a date and time field in my logs; below are the first lines of that field (I've omitted the other fields):
09/11/2017 17:06
09/11/2017 17:06
09/11/2017 17:06
09/11/2017 17:06
09/11/2017 17:07
09/11/2017 17:07
09/11/2017 17:09
09/11/2017 17:09
09/11/2017 17:10

When I upload my logs, Kibana won't recognize this as a date and time, so I can't do any time analysis on them. Here is my Logstash config:

input {
  file {
    path => "****"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["timedate","connection_protocol","protocol","srcip","srcport","dstip","dstport","hostname"]
  }
  date {
    match => ["timedate", "dd/MM/yyyy HH:mm:ss"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "timetesting"
  }
  stdout {}
}

Hi @I_like_dogs,

Can you provide me with the mappings for the index?

curl -XGET <es_url>/timetesting/_mappings

Thanks,
Chris

{
  "timetestplswork": {
    "mappings": {
      "logs": {
        "properties": {
          "@timestamp": { "type": "date" },
          "@version": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "connection_protocol": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "dstip": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "dstport": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "host": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "message": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "path": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "protocol": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "srcip": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "srcport": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "tags": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
          "timestamp": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }
        }
      }
    }
  }
}

This is for the following config:

input {
  file {
    path => "******"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    # timestamp, connection_protocol, protocol, srcip, srcport, dstip, dstport, hostname
    columns => ["timestamp","connection_protocol","protocol","srcip","srcport","dstip","dstport","hostname"]
  }
  date {
    match => ["timestamp", "dd/MM/yyyy HH:mm:ss"]
    timezone => "UTC"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "timetestplswork"
  }
  stdout {}
}

Try using the date datatype instead. Then, Kibana will understand that it is a date field and allow the appropriate aggregations.
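
For example, the timestamp field in your mapping could look something like this (assuming you keep the field name timestamp and your raw values include seconds):

"timestamp": {
  "type": "date",
  "format": "dd/MM/yyyy HH:mm:ss"
}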

Your timestamps do not have a seconds field, so this pattern does not match. I see the events getting a _dateparsefailure tag. Use dd/MM/yyyy HH:mm.
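
In your date filter, that would be:

date {
  match => ["timestamp", "dd/MM/yyyy HH:mm"]
  timezone => "UTC"
}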

I copied and pasted the first lines out of Excel, my mistake. When you double-click on the cells, the seconds are there.

Hi Chris, I'm new to Elasticsearch/Kibana.

Do I use this in the Dev Tools?

Yup!

You'll want to create a new index and set the appropriate mappings.
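
For example, in Dev Tools, something like this should work (I'm assuming a hypothetical new index name of timetesting_v2, and mapping only the timestamp field explicitly; the other fields can stay dynamically mapped):

PUT timetesting_v2
{
  "mappings": {
    "logs": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "dd/MM/yyyy HH:mm:ss"
        }
      }
    }
  }
}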

Then you can try reindexing your existing index to this new index.
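
Something along these lines should do it (again assuming the new index is called timetesting_v2):

POST _reindex
{
  "source": {
    "index": "timetestplswork"
  },
  "dest": {
    "index": "timetesting_v2"
  }
}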

After that, create an index pattern in Kibana using the new index and see if you can do time analysis on your logs.

thank you!!! it worked :slight_smile:
