Convert a date format in a Logstash pipeline

Hello, I am trying to load a CSV file into Elasticsearch via Logstash. This is a snippet of my data set:

transaction_date,customer_name,product_name,product_id,price_unit,quantity,fidelity_card_id,discount,category
2-24-2017,Roberto Michel,Taillefine aux fruits 0% Fraise - Danone - 500 g (4 x 125 g),1,556.26,256,4.175E+15,9,yaourt
5-15-2016,Reiko Branchet,"Activia Fibre - Danone - 171,5 g (150 g+21,5 g)",2,612.98,163,3.74284E+14,8,yaourt

In the pipeline configuration I used mutate with convert to change the string fields into numeric ones, because the csv filter parses every column as a string and Elasticsearch would otherwise store those fields as text. But the problem appeared when I used this:

date {
  match => {transaction_date , "MM dd yyyy"}
}

in order to convert this field from a string into a date.
This is the error I got when running Logstash:

Cannot create pipeline {:reason=>"Expected one of #, => at line 26, column 29 (byte 713) after filter {\n csv {\n separator => ","\n\t#transaction_date,customer_name,product_name,product_id,price_unit,quantity,fidelity_card_id,discount\n columns => ["transaction_date","customer_name","product_name","product_id","price_unit","quantity","fidelity_card_id","discount"]\n} \nmutate\t{\t\t\nconvert =>{"discount" => "integer"\t}\t\t\t\t\t\nconvert =>{"price_unit" =>"float"}\t\t\t\t\t\t\t\nconvert =>{"product_id" =>"integer"}\t\t\t\t\t\t\t\nconvert =>{"quantity"=> "integer"}\t\t\t\t\t\n\n}\n\ndate {\n\tmatch => {transaction_date "}
2017-07-28 19:26:49,905 Api Webserver ERROR No log4j2 configuration file found. Using default configuration: logging only errors to the console.
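
For context, the "Expected one of #, =>" error is a pure syntax problem in the date filter: match takes an array in square brackets, and the field name has to be a quoted string. A minimal sketch of the corrected syntax, keeping the original pattern for now (whether that pattern actually matches the data is a separate issue, addressed further down):

date {
  match => [ "transaction_date", "MM dd yyyy" ]
}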

This is my config file (the filter part):

filter {
  csv {
    separator => ","
    #transaction_date,customer_name,product_name,product_id,price_unit,quantity,fidelity_card_id,discount
    columns => ["transaction_date","customer_name","product_name","product_id","price_unit","quantity","fidelity_card_id","discount"]
  }
  mutate {
    convert => { "discount" => "integer" }
    convert => { "price_unit" => "float" }
    convert => { "product_id" => "integer" }
    convert => { "quantity" => "integer" }
  }
  date {
    match => {transaction_date ,"MM dd yyyy"}
  }
}

Please, any help! How can I change the type of this field?

Any help! Has anyone faced the same problem and resolved it?

The date format has to align with the date string you're getting from the filter.
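
For example, the sample rows carry dates like 2-24-2017: month-day-year, dash-separated, without zero padding. A pattern aligned with that string would be M-d-yyyy rather than MM dd yyyy, roughly:

date {
  match => [ "transaction_date", "M-d-yyyy" ]
}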

So what am I supposed to do?

Any help please! I need to turn this field into a date; I need it for building the dashboard later on.

Please refer to the date filter plugin:
https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html

I did, but found nothing to solve my problem. I don't get why the date filter I used does not affect the type of the field stored in Elasticsearch. The other fields have changed, but the date field stays the same, and when running Logstash no error appears.

Please recreate the existing index.

I deleted the index and recreated it, but nothing changed.

Can you check the mapping of the index?
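
For reference, assuming Elasticsearch is listening on localhost:9200 and the index is called transactions (a placeholder name), the mapping can be retrieved with:

curl -XGET 'localhost:9200/transactions/_mapping?pretty'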

This is the transaction_date part of the result:

    "transaction_date": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      }
    }
  }
}, 

It's still in string format.

Can I create a template where I define my mapping, and then use it in the pipeline configuration?
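
For reference, the elasticsearch output plugin does support this: it can be pointed at a custom template file via its template options. The file path, template name, and index name below are placeholders:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "transactions"
    template => "/path/to/transactions_template.json"
    template_name => "transactions"
    template_overwrite => true
  }
}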

You have to delete your current index or create another index, and try to insert the data again.

I did create another index, but nothing works either.

Do you have an example of a Logstash config file where you used the date filter and it worked? If so, can you send it to me?

The config below worked for me:

date {
  match => [ "timestamp_string", "dd.MM.yyyy HH:mm:ss.SSS" ]
  target => "new_timestamp"
}
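
Adapted to the field in question, a sketch might look like this (the M-d-yyyy pattern is an assumption based on the sample rows shown above). With target pointing back at the same field, the parsed date replaces the original string, which is why the index has to be recreated for the new date mapping to take effect:

date {
  match => [ "transaction_date", "M-d-yyyy" ]
  target => "transaction_date"
}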


Thank you for your reply. I did it and it worked!

