Add field with Logstash in CSV file

Hi,
I created a config.conf file to insert data.csv into Elasticsearch.
I need to remove a field and add an empty field.
This is my config.conf:

input {
  file {
    path => "/home/salma/Documents/data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
      separator => ","
#Date,Open,High,Low,Close,Volume (BTC),Volume (Currency),Weighted Price
     columns => ["Date","Open","High","Low","Close","Volume (BTC)","Volume (Currency)","Weighted Price"]
  }
mutate {
    add_field => { "foo_%{pred}"}
  }
}
output {
   elasticsearch {
     hosts => "http://localhost:9200"
     index => "bitcoin-prices"
  }
stdout {}
}

but nothing changes in Elasticsearch.
Can anyone help, please?
Thank you!

add_field => { "foo_%{pred}"}

add_field needs two things: the name of the new field and its desired value. It should look like this:

add_field => {
  "field name" => "field value"
}

Secondly, %{pred} doesn't make sense since your events don't have a field named pred.
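For example, if the goal is to add a new empty field and drop an existing one, the mutate filter could look like this (the field names pred and "Weighted Price" are just illustrations; adjust them to the fields you actually want to add and remove):

mutate {
  # create a new field named "pred" with an empty string as its value
  add_field => { "pred" => "" }
  # drop an existing column parsed by the csv filter
  remove_field => ["Weighted Price"]
}

Note that add_field takes a literal field name on the left; the %{...} syntax is only for referencing values of fields that already exist on the event.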

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.