Hi All,
We are trying to upload a CSV file that has a date field, so we are using the date filter to convert that field from string to date. The problem we are facing is that some rows have a proper date while others have "NA", so we planned to use an if condition in Logstash to convert only the rows with a date and remove the field from the "NA" rows, but it was not successful.
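To illustrate, the Installation Date column holds a mix of year values and the literal string "NA" (hypothetical sample values, not our real data):

Installation Date
2015
NA
2012
NA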
Here is my config file:
input {
  file {
    path => "/opt/installables/csv/data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}
filter {
  csv {
    separator => ","
    columns => ["Asset No","Asset type","Asset in Store","Critical Asset","Assigned Flag","Assigned to","Licence till","Software Type","Software Provider","Device","Model","Manufacturer","Version","Installation Date","Licence Expired","Under Warranty","End of life","Under AMC","Compliance","Decommissioned","Unpatched Software"]
  }
  if [Installation Date] == "NA" {
    mutate {
      remove_field => ["Installation Date"]
    }
  } else {
    date {
      match => ["Installation Date", "YYYY"]
    }
  }
}
output {
  elasticsearch {
    hosts => "1.1.1.6:9200"
    index => "assetmgmt"
  }
}
Instead of sending to Elasticsearch, if I use rubydebug I can see the expected results, so it looks like there is a problem while indexing.
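For debugging I swapped the elasticsearch output for a stdout block along these lines (a minimal sketch of the standard rubydebug output):

output {
  stdout {
    codec => rubydebug
  }
}

With that in place, the events print to the console and look as expected.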
I have done the mapping in Kibana as well. This is what I did in Kibana:
PUT assetmgmt
{
  "mappings": {
    "doc": {
      "properties": {
        "Installation Date": {
          "type": "date",
          "format": "year"
        }
      }
    }
  }
}
Instead of "year" I also tried "YYYY", but that didn't help.
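That is, the same PUT with only the format swapped (sketch of the variant I tried):

PUT assetmgmt
{
  "mappings": {
    "doc": {
      "properties": {
        "Installation Date": {
          "type": "date",
          "format": "YYYY"
        }
      }
    }
  }
}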
Any advice please? I've been breaking my head on this for the past week.
Thanks
Gauti