Wrong values populated into date fields using Logstash

(Ramanand) #1

Hello,
I can see that the date fields are populated wrongly in Elasticsearch. I generate a CSV file and insert the data through a Logstash conf file. Below is the screenshot where I have highlighted the date columns that are inserted wrongly. For example, in the CSV the vendor_start_dt, vendor_end_dt and event_tm columns are 01-Feb-2019, but when I view the data through Kibana they show as 01-Jan-2019. This is the case for all the values.

Below are the contents of the Logstash conf file:

input {
  file {
    path => "/data01/logstash/data/daily_final_report_201902110807.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "filename","fileformat","filesize","data_received_dt","vendor_start_dt","vendor_end_dt","bucketname","createdt","filetype","event_tm","processing_status" ]
  }
}
output {
  elasticsearch {
    hosts => "addd.xxxx.local:9200"
    index => "rr_log_gen"
    document_type => "_doc"
    workers => 1
  }
  stdout {}
}
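
(A side note: rather than relying on the index mapping to parse these strings, the timestamps can be parsed in Logstash itself with a date filter, so Elasticsearch receives unambiguous values. A minimal sketch, assuming the column names from the csv filter above; the target option writes the parsed value back into the same field:)

```
filter {
  # parse the shell-generated "%Y-%m-%d %T" timestamp explicitly;
  # note upper-case MM (month) vs lower-case mm (minute)
  date {
    match  => [ "event_tm", "yyyy-MM-dd HH:mm:ss" ]
    target => "event_tm"
  }
}
```

The same filter block can be repeated for vendor_start_dt and vendor_end_dt.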

Could you let me know what is wrong in this case?

(Ry Biesemeyer) #2

What does your Elasticsearch mapping look like for those fields?

GET addd.xxxx.local:9200/rr_log_gen/_mapping/_doc

I've previously seen issues where a user configured a date field with a format including D (day of year) instead of d (day of month), causing a similar effect.

(Ramanand) #3

Below are the mappings. I have the issue in the fields vendor_start_dt, vendor_end_dt and event_tm:

{
  "rr_log_gen" : {
    "mappings" : {
      "_doc" : {
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "@version" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "bucketname" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "createdt" : {
            "type" : "date"
          },
          "data_received_dt" : {
            "type" : "date"
          },
          "event_tm" : {
            "type" : "date",
            "format" : "yyyy-mm-dd HH:mm:ss"
          },
          "fileformat" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "filename" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "filesize" : {
            "type" : "long"
          },
          "filetype" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "host" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "message" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "path" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "processing_status" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "tags" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "vendor_end_dt" : {
            "type" : "date",
            "format" : "yyyy-mm-dd HH:mm:ss"
          },
          "vendor_start_dt" : {
            "type" : "date",
            "format" : "yyyy-mm-dd HH:mm:ss"
          },
          "vendorname" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          }
        }
      }
    }
  }
}

(Ry Biesemeyer) #4

What do those columns look like in your CSV?

(Ramanand) #5

I have attached the screenshot in my initial post. I am using a shell script to generate the date columns with the format +%Y-%m-%d %T, and if we view the CSV file it shows the data in m/d/yyyy hh:mi:ss format.

(Ry Biesemeyer) #6

The dates in your screenshot are not of the format you say you have specified. Sometimes visual editors like Excel can present data in a different way than the raw bytes. Please paste the raw bytes from your CSV (perhaps open it with Notepad or some other basic editor that doesn't manipulate presentation).

(Ramanand) #7

The csv file has YYYY-MM-DD HH:MI:SS format
[ 2018-08-01 00:00:00 ]

#8

Really? Your mapping has lowercase mm for both month and minute?
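
(For context: in the Joda-style date formats Elasticsearch uses, MM means month-of-year while mm means minute-of-hour, so a format of yyyy-mm-dd HH:mm:ss never reads the month at all and the month defaults to January, matching the 01-Jan values seen in Kibana. A sketch of a corrected mapping, using a hypothetical new index name, since the format of an existing date field cannot be changed in place and the data would need to be reindexed:)

```
PUT rr_log_gen_v2
{
  "mappings": {
    "_doc": {
      "properties": {
        "event_tm":        { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "vendor_start_dt": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "vendor_end_dt":   { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" }
      }
    }
  }
}
```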

(Ramanand) #9

yes!!

#10

That seems odd. Unfortunately I do not have an elasticsearch node I can test with. I really should get one built out.

(Ramanand) #11

I have a doubt: can we exclude the HH:MI:SS part from the field and populate just the date portion into a new field mapped as a date?
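
(One way to do this in Logstash is to copy the field and strip the time portion with mutate filters before indexing; the new field, event_dt here as a hypothetical name, can then be mapped as a date with format yyyy-MM-dd. A sketch, assuming the field names used earlier in this thread:)

```
filter {
  # copy event_tm, then drop everything from the first space onward,
  # leaving only the yyyy-MM-dd date portion
  mutate { copy => { "event_tm" => "event_dt" } }
  mutate { gsub => [ "event_dt", " .*$", "" ] }
}
```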

(system) closed #12

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.