Hi all,
I have data in the format shown in the attached image.
I'm able to import the data via the csv filter plugin, which works great. However, I'm stuck on the next step: mapping the rows to numbers so the data can be analyzed in Kibana. The values should generally be numbers, as in the second two columns. However, if there is an error with a data point at some point in time, an error code is written instead, as shown in the last two columns. Is there some way to map the columns to the number datatype while also handling the occasional case where the value is a string? (I've put a rough sketch of what I'm imagining after my config below.)
For reference, below is my current Logstash config, which needs to be expanded on:
input {
  file {
    path => "C:/Users/zach/Downloads/pdr*.csv"
    start_position => "beginning"
    # don't persist the read position between runs ("NUL" on Windows)
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
    autodetect_column_names => true
    autogenerate_column_names => true
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "pdr-data"
  }
}
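
What I'm imagining is something along these lines: a conditional in the filter block that checks whether a value looks numeric before converting it, and moves error codes into a separate field otherwise. The sensor_a field name is just a placeholder (my real column names come from autodetect_column_names), and I haven't tested this, so I'm not sure it's even the right approach:

filter {
  # "sensor_a" is a placeholder for one of the numeric columns
  if [sensor_a] =~ /^-?\d+(\.\d+)?$/ {
    # value looks numeric, so convert it to a float
    mutate { convert => { "sensor_a" => "float" } }
  } else {
    # value is an error code; move it to a separate field so the
    # original field stays purely numeric in the Elasticsearch mapping
    mutate { rename => { "sensor_a" => "sensor_a_error" } }
  }
}

The idea is that sensor_a would always be a number in Elasticsearch, with error codes ending up in sensor_a_error as strings. But I'd have to repeat this block for every numeric column, which feels clunky. Is there a cleaner way, or is this better handled on the Elasticsearch side, e.g. with ignore_malformed on the numeric fields in the index mapping?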