Read a CSV file and insert/update the index based on a condition in Logstash

Hi folks,
I have a quick question.
My scenario is to read a CSV file and insert/update the index based on a condition.
Here is my config.
I need to filter the 'age' field from the CSV file, and only if age > 18 should I insert/update the document in the index.

input {
  file {
    path => "C:/ElasticSearch/logstash-5.3.2/logstash-5.3.2/data/createmember.txt"
    start_position => "beginning"
    type => "createmember"
  }
}

filter {
  csv {
    columns => ["memberid","firstname","lastname","age","status"]
    separator => ","
  }

  ruby {
    # do something to validate input age (age > 18) to set status in index
    **?????**
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "createmember"
    document_type => "createmember"
    document_id => "%{memberid}%{firstname}%{lastname}%{age}%{status}"
    doc_as_upsert => "true"
  }
}

thanks

You don't need a ruby filter.

if [age] > 18 {
  # whatever
}

This assumes that age is a numeric field. You can tweak the csv filter to make that column numeric or you can use a mutate filter to convert the field value.
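Putting that together, a minimal sketch of the filter section (the `convert` option on the csv filter is one way to make `age` numeric; dropping under-age events is an assumption about the behaviour you want):

```
filter {
  csv {
    columns => ["memberid","firstname","lastname","age","status"]
    separator => ","
    # parse the age column as an integer so the comparison below works
    convert => { "age" => "integer" }
  }

  # keep only members older than 18; everything else never reaches the output
  if [age] <= 18 {
    drop { }
  }
}
```

With this in place, only events with age > 18 reach the elasticsearch output, so the upsert only happens for those members.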

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.