Logstash date filter parsing date

I am using the date filter, but it is not working. Kindly help me out.

Below is the config file. I am ingesting data from a CSV file in S3 via Logstash, on ELK 7.

filter {
  csv {
    separator => ","
    columns => [ "Date", "Source_Name", "Store_Name", "Zip_Code", "Category", "Subcategory", "PRODUCT_TYPE", "Stock_Status", "Additional_Text_1", "Additional_Text_2", "UOM", "Size", "Size", "Reg_Price_Alt", "UOM_Alt", "Size_Alt" ]
  }

  date {
    match => ["Date", "dd/MM/yyyy HH:mm"]
    target => "Date"
  }
}
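To confirm what the date filter actually produces, a minimal output block that pretty-prints each event to the console can help (a debugging sketch using Logstash's standard rubydebug codec; remove it once parsing looks right):

```
output {
  stdout { codec => rubydebug }
}
```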

The date format in the CSV is "25/05/2019 00:05". But the mapping shown in Kibana is still:

"Date": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
}
This is the Kibana output on Discover:

t Date 2019-06-21T03:00:00.000Z


What issue do you have with that? It looks like the date filter parsed it, and it got mapped to text as requested by the mapping.

Thanks for the reply. The type should be date instead of text.

OK, so create a mapping for the index that tells it that. I believe it will have to be a new index.
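As an illustration, a legacy index template (ES 7 `_template` syntax) that maps the Date field as a date could look like the sketch below. The template name and index pattern are placeholders, not taken from the thread; after the date filter runs, Logstash writes the field as an ISO 8601 timestamp, which the default date format accepts:

```
PUT _template/store_prices
{
  "index_patterns": ["store-prices-*"],
  "mappings": {
    "properties": {
      "Date": { "type": "date" }
    }
  }
}
```

Since the type of an existing field cannot be changed, the template only takes effect on indices created after it is installed.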

Thanks, I have created a template with all the mappings. That fixed my issue. Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.