Hi Guys,
I have multiple CSV files that I want to ingest into Elasticsearch with Logstash.
Here is my Logstash config:
input {
  file {
    path => "/tmp/email/*.csv"
    type => "testcsv"
    # start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    autodetect_column_names => true
  }
  # Pull filepath, filename, and filetime out of the source path,
  # e.g. /tmp/email/report_20200101.csv
  grok {
    match => {
      "path" => "%{GREEDYDATA:filepath}/%{GREEDYDATA:filename}_%{GREEDYDATA:filetime}\.csv"
    }
  }
  # Normalize the filename into a valid index name: replace whitespace
  # and special characters with dots, then lowercase
  mutate {
    gsub => [ "filename", "[\s?\\?#-]", "." ]
    lowercase => [ "filename" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[filename]}-%{+YYYY.MM.dd}"
  }
}
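To illustrate what I expect: for a hypothetical file /tmp/email/Email Report_20200101.csv, the filter should produce roughly:

filepath => "/tmp/email"
filename => "email.report"
filetime => "20200101"

so the event goes to an index like email.report-2020.01.01 (the date part comes from the event's @timestamp, not from filetime).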
My Logstash config successfully indexes the first CSV file.
But the documents from the other files keep the column names detected from the first file.
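For example, with two hypothetical files like these:

email_20200101.csv:
sender,recipient,subject
a@example.com,b@example.com,hello

bounce_20200101.csv:
address,reason
c@example.com,mailbox full

the documents from bounce_20200101.csv do land in their own bounce-* index, but with the fields sender, recipient, and subject from the first file instead of address and reason.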
Please share tips on how to automatically create one index per file, named after the filename and using each file's own column headers.
Thanks in advance.