Date field as "Date", Time as "Time" - why is this so hard to do in Logstash?

Hi all,

I'm messing around with my CSV, which has a date field formatted as "DD.MM.YYYY".

[screenshot]

But whatever I do, the Logstash filter ignores it --> the field is still a String.
Even the date_formatted plugin doesn't work as I expected.

What I want is this: instead of only @timestamp being mapped as Date, I want the CSV Date field mapped as "Date" --> but there's no way, and I have no ideas left!

[screenshot]

Why does the ELK Stack have so many problems managing the most obvious information, like dates and times?
Why is the date {} filter in Logstash so hard to get right?
This should be a simple task, but it isn't!

Please, what can I do to get this working?
Thanks

If you try

date { match => [ "Date", "dd.MM.YYYY" ] target => "Date" }

what issues do you have with the result?

Hi,
thanks for the reply!
I tried it; the result is:
[screenshot]

Any suggestions ?

Note that the following syntax is not working in my conf file:

date {
    match => [ "Date", "dd.MM.YYYY" ]
    target => "Date"
}

That is what I tried before, hm, maybe it's quite sensitive to spacing somewhere...

Regards

In what way does it not work?

Note that once a mapping is established in Elasticsearch, such as Date being a string, it cannot be changed without creating a new index. So when you are debugging things like this, you need to delete the index between each iteration of your Logstash config.
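For example, in the Kibana Dev Tools console (the index name `my-index` below is a placeholder for your actual index) you can inspect the current mapping and delete the index before re-ingesting:

```
GET my-index/_mapping

DELETE my-index
```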

That's what I did before trying yours!
I cleaned out Elasticsearch and Filebeat completely.

The field Date is still mapped as a String.
Believe me, I'm getting frustrated and I don't know why the hell this is not working.
I tried everything I found here in this discuss forum, even the date_formatted plugin, which gives
access to change the pattern of a field.
Nothing works.
My wish: an optional "Auto-Type" for fields that look like what they probably are, no matter what format they come in:
"Date is Date", "Time is Time", "Number is Number", etc.
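As an aside, Elasticsearch does offer something close to this: dynamic mapping with date detection. A minimal sketch (the index name and format list are assumptions) that would map incoming strings like "24.12.2019" to the date type automatically when the index is created:

```
PUT my-index
{
  "mappings": {
    "date_detection": true,
    "dynamic_date_formats": ["dd.MM.yyyy", "strict_date_optional_time"]
  }
}
```

Note that date detection only applies to new fields arriving as JSON strings; it does not change a mapping that already exists.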

Here is the interesting part

csv {
    columns => ["Date","Time","Year","Month","Day"]
    separator => "|"
    quote_char => "~"
}
mutate { split => { "message" => "|" } }
mutate { add_field => { "Date" => "%{[message][0]}" } }
mutate { add_field => { "Time" => "%{[message][1]}" } }
mutate { add_field => { "Year" => "%{[message][2]}" } }
mutate { add_field => { "Month" => "%{[message][3]}" } }
mutate { add_field => { "Day" => "%{[message][4]}" } }
mutate { convert => ["Year","integer"] }
mutate { convert => ["Day","integer"] }
date {
    match => [ "Date", "dd.MM.YYYY" ]
    target => "Date"
}

Logstash converts everything, all good except Date! Even with your advice!?

Change the target to a new name

target => "DateABCD"

Ingest some documents, do the index pattern refresh in Kibana and see what you get for that field.
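One more thing worth checking in your config: the csv filter already creates the Date, Time, Year, Month and Day fields from its columns list, so the later mutate add_field calls append a second copy and turn each of those fields into a two-element array, which can confuse the date filter. A minimal sketch without the duplication (same separator and pattern as your config, untested against your data):

```
csv {
    columns    => ["Date","Time","Year","Month","Day"]
    separator  => "|"
    quote_char => "~"
    convert    => { "Year" => "integer" "Day" => "integer" }
}
date {
    match  => [ "Date", "dd.MM.YYYY" ]
    target => "Date"
}
```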

No Sir,

even the field DateABCD is missing !!!

Any suggestions?

Finally made it!
I had some syntax errors in my added fields.
Your advice works well :slight_smile:
However, I hope there will be an Auto-Type for fields in future updates.
That would make things easier when working with a bunch of different files.

Thanks Regards

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.