Hi team,
I am a newbie, and I am using ELK as a reporting engine for data exported from CSV files.
I am currently using Logstash 2.3.2 to load CSV data from a remote server through Filebeat. Since my input data is structured CSV files, I map each column to index fields in Logstash using the csv filter, and I have created a separate Elasticsearch index for each of my CSV files.
My problem is that in a few cases, when the column order or a data type changes in my CSV files, the mismatch with the filter causes exceptions on input and the records sent from Filebeat go missing. Since the input volume is huge, I am not able to catch these failures as they occur.
Could you please suggest the best way to capture these errors, store them in an index, and/or raise an alert when an error occurs?
Appreciate your patience in reading this, and thanks in advance!
My Logstash config is below:
input {
  beats {
    port => 5056
    type => "csv"
  }
}
filter {
  if [type] == "csv" and [source] =~ "CPU_Usage.csv" {
    csv {
      columns   => ['ID','HOSTNAME','CHECKTIME','CPUUSAGE']
      add_field => { "indexName" => "cpuusage" }
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "%{indexName}-%{+xxxx.ww}"
  }
  stdout {}
}
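
From what I have read, the csv filter should add a "_csvparsefailure" tag to any event it fails to parse. Would routing those tagged events to a separate index, roughly like the output sketch below, be the right approach? (The "csv-failures" index name is just a placeholder I made up.)

output {
  if "_csvparsefailure" in [tags] {
    # events the csv filter could not parse go to a dedicated failures index
    elasticsearch {
      hosts => "localhost:9200"
      index => "csv-failures-%{+xxxx.ww}"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "%{indexName}-%{+xxxx.ww}"
    }
  }
}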