Sorting behaviour of a Date Field in Kibana Controls Option List

Hi there,
I'm completely new to Kibana and the Elastic Stack (6.5.4). Everything is running well per your install instructions,
and my first steps in Kibana look quite promising.

The first thing I noticed is the sorting behaviour of fields formatted as dates.
I'm used to the common ascending/descending behaviour of tools like Excel,
but Kibana sorts my "Date" field in a Controls option list as in the left column of the sample below (I noticed the control is still at an experimental stage).

What I need is shown in the right column of the sample.
I tried to find something on the web that could help me, but I'm lost.
I hope you can help me out here.
Thanks in advance,
Brgds

Kibana | What I Need (Desc or Asc)
01.01.2019 | 09.01.2019
02.01.2019 | 08.01.2019
03.01.2019 | 07.01.2019
04.01.2019 | 06.01.2019
05.01.2019 | 05.01.2019
06.01.2019 | 04.01.2019
07.01.2019 | 03.01.2019
08.01.2019 | 02.01.2019
09.01.2019 | 01.01.2019
01.12.2018 | 11.12.2018
02.12.2018 | 10.12.2018
03.12.2018 | 09.12.2018
04.12.2018 | 08.12.2018
05.12.2018 | 07.12.2018
06.12.2018 | 06.12.2018
07.12.2018 | 05.12.2018
08.12.2018 | 04.12.2018
09.12.2018 | 03.12.2018
10.12.2018 | 02.12.2018
11.12.2018 | 01.12.2018

Here is my Logstash pipeline for the CSV log:

input {
  beats {
    port => "5044"
  }
}
filter {
  csv {
    columns => ["Date", "Time", "Country"]
    separator => "|"
    quote_char => "~"
  }
  mutate {
    split => { "message" => "|" }
  }
  mutate {
    add_field => { "Date" => "%{[message][0]}" }
  }
  mutate {
    add_field => { "Time" => "%{[message][2]}" }
  }
  mutate {
    add_field => { "Country" => "%{[message][3]}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}
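Side note: the csv filter should already assign each pipe-separated value to the named columns, so the mutate blocks may be redundant; a trimmed sketch of the same pipeline (same columns, separator, and index assumed) could be:

```
input {
  beats {
    port => "5044"
  }
}
filter {
  # csv already maps each pipe-separated value onto the named columns
  csv {
    columns => ["Date", "Time", "Country"]
    separator => "|"
    quote_char => "~"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}
```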

It looks like your Date field may not be getting mapped as a date in Elasticsearch. What do the mappings for your index look like?

Hi Bill,
yeah, you're right, it's mapped as a string:

,"readFromDocValues":true},{"name":"Date","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false},{"name":"Date.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true

Now I've added this date filter:

  date {
    match => ["Date", "dd-MM-yyyy HH:mm:ss"]
    target => "Date"
  }

but it's still mapped as a string!?

Do I need to restart Filebeat, or does Filebeat recognise the change to the field on the fly and update the given index?
If not, is there any way to change the field type for my "old" index?
(Sorry, very beginner questions.)
Thanks and regards

Ah, I got it:
https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html

I will try the "reindex" approach.
That's a good way to learn how it works :slight_smile:
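For reference, one way to try this is a small second Logstash pipeline that reads the old index and writes the documents into a new index that has a proper date mapping (the index names below are just placeholders, and the new index with the corrected mapping would need to exist first):

```
input {
  # read every document from the old index (placeholder name)
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}
filter {
  # parse the string into a real date before writing it back
  date {
    match => ["Date", "dd.MM.yyyy"]
    target => "Date"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test-reindexed"  # placeholder new index with a date mapping
  }
}
```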

Anyway, thanks for the reply.
Regards

Hi Bill,
apart from what I told you yesterday, my Date field, and even my Time field, are still strings, and now I'm totally lost.
I tried all the pattern examples I could find to map these fields from string to date, but whatever I do,
the index says they are strings and the date sorting is still wrong.
How do I map these simple fields as date and time?
The source is still a CSV; the Date string looks like 11.01.2019 (dd.mm.YYYY) and Time like 00:00:00 (hh:mm:ss)

The examples I found say to map it this way:

  date {
    match => [ "Date", "dd.MM.YYYY" ]   # tried also with : - / \ and spaces
    target => "Date"
  }

  date {
    match => [ "Time", "HH:mm:ss" ]   # tried also with . - / \ and spaces
    target => "Time"
  }

It's not working!

In Kibana:
Time = 0,285659722222222
Date = 10.01.2019, but not as a date

What am I doing wrong?

Thanks again

I would ask a question in the Beats forum about Filebeat field formatting, as I am not an expert on that subject and that is really the source of your issue.
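One detail worth double-checking in the date filters above: Logstash date patterns are Joda-style, where uppercase YYYY means "week year", so a calendar year should be lowercase yyyy. A time-only value like 00:00:00 also can't become a date on its own; a common approach is to combine the two columns into a single timestamp first. A sketch, assuming the field names from the config above:

```
filter {
  # build one timestamp string from the two CSV columns
  mutate {
    add_field => { "DateTime" => "%{Date} %{Time}" }
  }
  # note lowercase yyyy: uppercase YYYY means week-year in Joda patterns
  date {
    match => ["DateTime", "dd.MM.yyyy HH:mm:ss"]
    target => "DateTime"
  }
}
```

Even with a correct pattern, an index that has already mapped Date as a string keeps that mapping; the parsed value only shows up as a date in a fresh (or reindexed) index.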

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.