Hello there,
we are using Logstash to import data from CSV files into Elasticsearch. The Logstash config has worked so far, but I am now trying to feed the CSV files in via the `stdin` input instead of the `file` input.
The `stdin` input itself works, but I perform `mutate`, `add_field`, and `update` operations on several columns, and those actions no longer work when the data comes in through the stdin pipe.
Here's my example config:
```
input {
  stdin {
    id => "${LS_FILE}"
  }
}

filter {
  mutate {
    add_field => { "kvasy_type" => "${KVASY_TYPE}" }
  }

  if "${LS_FILE}" == "xyz.csv" {
    csv {
      separator => ";"
      columns => [ "name1", "name2", "name3", "OID" ]
    }
    mutate {
      add_field => { "searchfield" => "%{name1} %{name2} %{name3}" }
    }
    if [OID] {
      mutate {
        update => { "OID" => "n-%{OID}" }
      }
    }
  }
}

output {
  stdout { codec => "json_lines" }
  file {
    path => "/home/kevin/output.csv"
    codec => "json_lines"
  }
}
```
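To illustrate what I expect: for a hypothetical semicolon-separated input line (values made up), the filter chain should produce an event roughly like this:

```
# input line piped to stdin (hypothetical values):
Alice;Bob;Carol;123

# expected event (other fields omitted):
{"name1":"Alice","name2":"Bob","name3":"Carol","OID":"n-123","searchfield":"Alice Bob Carol"}
```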
That's just a snippet of the complete config; I pipe several CSV files into Logstash and set environment variables beforehand to select the column filters for each file.
I start Logstash with `./logstash -f cfg.config < xyz.csv`. The output contains only the columns of the original CSV; none of the mutate actions seem to be processed. I also tried auto-detecting the columns, with no change.
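For context, the environment variables are set in a small wrapper before piping the file in, roughly like this (the `KVASY_TYPE` value shown is a placeholder; the real values vary per file):

```
# hypothetical wrapper; values are examples only
export LS_FILE="xyz.csv"
export KVASY_TYPE="sometype"
./logstash -f cfg.config < "$LS_FILE"
```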
Does anybody know why this doesn't work as desired?
Regards KK