Hi guys,
I'm trying to work around a logstash-filter-csv issue: each line needs to be cleaned of stray whitespace and double quotes before the CSV parser runs.
My problem is probably that I don't know the "default field name" the raw line is stored in before it gets parsed. Could you give me a hint?
Thanks a bunch!
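To show what I mean, here is a rough Python sketch of the cleanup I'm after (the sample line is made up, not from my real data): first strip the double quotes, then collapse any whitespace that follows a comma.

```python
import re

def clean(line: str) -> str:
    # Step 1: remove all double quotes.
    line = re.sub(r'"', '', line)
    # Step 2: collapse a comma followed by whitespace into a bare comma.
    line = re.sub(r',\s+', ',', line)
    return line

# Hypothetical sample line:
print(clean('"01/02/2024", "12:30:00", "some description", "12.34"'))
# -> 01/02/2024,12:30:00,some description,12.34
```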
input {
  file {
    path => "/home/user/folder/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # don't persist read position (testing only)
  }
}

filter {
  mutate {
    # Strip double quotes, then collapse whitespace after commas,
    # before the csv filter parses the line.
    gsub => [
      "message", '"', "",
      "message", ',\s+', ","
    ]
  }
  csv {
    separator => ","
    columns => ["date", "time", "desc", "long"]
  }
  mutate {
    add_field => { "timestamp" => "%{date} %{time}" }
  }
  date {
    match => ["timestamp", "MM/dd/yyyy HH:mm:ss"]
    remove_field => ["timestamp"]
  }
  mutate {
    remove_field => ["desc"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "project"
  }
  stdout {}
}
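For reference, the add_field plus date filter step is meant to behave like the following Python sketch: join the date and time columns, then parse the result with the equivalent of the "MM/dd/yyyy HH:mm:ss" pattern (the field values here are hypothetical).

```python
from datetime import datetime

# Hypothetical parsed CSV columns, as the csv filter would produce them.
event = {"date": "01/02/2024", "time": "12:30:00"}

# Mirrors: add_field => { "timestamp" => "%{date} %{time}" }
timestamp = f"{event['date']} {event['time']}"

# Mirrors: match => ["timestamp", "MM/dd/yyyy HH:mm:ss"]
# (Joda pattern translated to the strptime equivalent.)
parsed = datetime.strptime(timestamp, "%m/%d/%Y %H:%M:%S")
print(parsed.isoformat())
# -> 2024-01-02T12:30:00
```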