Convert a date format in logstash pipeline

(marwa) #1

hello, I am trying to load a CSV file into Elasticsearch via Logstash. This is a snippet of my dataset:

2-24-2017,Roberto Michel,Taillefine aux fruits 0% Fraise - Danone - 500 g (4 x 125 g),1,556.26,256,4.175E+15,9,yaourt
5-15-2016,Reiko Branchet,"Activia Fibre - Danone - 171,5 g (150 g+21,5 g)",2,612.98,163,3.74284E+14,8,yaourt

In the pipeline configuration I used mutate with convert to change string fields into numeric ones, because Elasticsearch stores these fields as strings by default. But the problem appeared when I used this:

date {
match => {transaction_date , "MM dd yyyy"} }

in order to convert this field from a string into a date.
This is the error I got when running Logstash:

Cannot create pipeline {:reason=>"Expected one of #, => at line 26, column 29 (byte 713) after filter {\n csv {\n separator => ","\n\t#transaction_date,customer_name,product_name,product_id,price_unit,quantity,fidelity_card_id,discount\n columns => ["transaction_date","customer_name","product_name","product_id","price_unit","quantity","fidelity_card_id","discount"]\n} \nmutate\t{\t\t\nconvert =>{"discount" => "integer"\t}\t\t\t\t\t\nconvert =>{"price_unit" =>"float"}\t\t\t\t\t\t\t\nconvert =>{"product_id" =>"integer"}\t\t\t\t\t\t\t\nconvert =>{"quantity"=> "integer"}\t\t\t\t\t\n\n}\n\ndate {\n\tmatch => {transaction_date "}
2017-07-28 19:26:49,905 Api Webserver ERROR No log4j2 configuration file found. Using default configuration: logging only errors to the console.

This is my config file (the filter part):

filter {
  csv {
    separator => ","
    columns => ["transaction_date","customer_name","product_name","product_id","price_unit","quantity","fidelity_card_id","discount"]
  }
  mutate {
    convert => { "discount" => "integer" }
    convert => { "price_unit" => "float" }
    convert => { "product_id" => "integer" }
    convert => { "quantity" => "integer" }
  }
  date {
    match => {transaction_date , "MM dd yyyy"}
  }
}

Please, any help! How can I change the type of this field?

(marwa) #2

Any help? Has anyone faced the same problem and resolved it?

(Joseph Johney) #3

The date format has to align with the date string you're getting from the field.
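For example, the sample rows in this thread have dates like 2-24-2017 and 5-15-2016, i.e. dash separators and no zero-padding, so a pattern matching them (an assumption based on the snippet above) would look like this. Note also that `match` takes an array in square brackets, not a hash in curly braces; the curly braces in the original config are what triggered the "Expected one of #, =>" parse error:

```
date {
  match => [ "transaction_date", "M-d-yyyy" ]
}
```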

(marwa) #4

So what am I supposed to do?

(marwa) #5

Any help please! I need to convert this field into a date, because I need it for building a dashboard later on.

(Joseph Johney) #6

Please refer to the date filter plugin documentation...

(marwa) #7

I did, but found nothing that solves my problem. I don't understand why the date filter I used does not affect the type of the field stored in Elasticsearch. The other fields have changed, but the date field doesn't. When running Logstash, no error appears.

(Joseph Johney) #8

Please recreate the existing index.

(marwa) #9

I deleted the index and recreated it, but nothing changed.

(Joseph Johney) #10

Can you check the mapping of the index?
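For reference, the mapping can be retrieved with a simple GET request against the index; the index name `transactions` below is hypothetical, substitute your own:

```
curl -XGET 'localhost:9200/transactions/_mapping?pretty'
```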

(marwa) #11

This is the transaction_date part of the result:

    "transaction_date": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256

It is still stored as a string (text).

(marwa) #12

Can I create a template where I define my mapping, and then use it in the pipeline configuration?
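That is possible: the elasticsearch output plugin can load a custom index template. A minimal sketch, assuming an index named `transactions` and a template file at a path of your choosing (both names are placeholders, not from this thread):

```
output {
  elasticsearch {
    hosts              => ["localhost:9200"]
    index              => "transactions"
    template           => "/path/to/transactions_template.json"
    template_name      => "transactions"
    template_overwrite => true
  }
}
```

The JSON file would define `"transaction_date": { "type": "date" }` in its mappings. Note that if the date filter itself is working, a template is usually not needed, since Elasticsearch maps the converted timestamp as a date automatically.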

(Joseph Johney) #13

You have to delete your current index, or create another index, and try to insert the data again.

(marwa) #14

I did create another index, but that didn't work either.

(marwa) #15

Do you have an example of a Logstash config file where you used the date filter and it worked? If so, could you send me the example?

(Joseph Johney) #16

The below config worked for me:

date {
  match  => [ "timestamp_string" , "dd.MM.yyyy HH:mm:ss.SSS" ]
  target => "new_timestamp"
}
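Adapted to the CSV in this thread, a complete filter block might look like the sketch below. The column names are taken from the snippets above; the `M-d-yyyy` pattern is an assumption based on sample values like 2-24-2017 (non-zero-padded month and day, dash separators):

```
filter {
  csv {
    separator => ","
    columns   => ["transaction_date","customer_name","product_name","product_id","price_unit","quantity","fidelity_card_id","discount"]
  }
  mutate {
    convert => { "discount"   => "integer" }
    convert => { "price_unit" => "float" }
    convert => { "product_id" => "integer" }
    convert => { "quantity"   => "integer" }
  }
  date {
    # match takes an array [field, pattern], not a hash
    match  => [ "transaction_date", "M-d-yyyy" ]
    # write the parsed timestamp back onto the same field
    target => "transaction_date"
  }
}
```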

(marwa) #17

Thank you for your reply.
I did that and it worked.

(system) #18

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.