Logstash Custom Filter: How to access fields of an uploaded CSV in a custom filter

I am uploading a CSV and writing a custom filter to perform operations on the fields in that CSV.
I have no idea how to access the fields inside a custom filter. Please guide me if anyone has done this.

Or, how can I access the documents (records) that pass through the filters?

I am afraid I do not understand your question. Could you please provide an example input as well as the expected output? What have you tried so far?


input {
  file {
    path => "E:\test.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["id", "name", "age", "country"]
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    action => "index"
    index => "myidex"
    workers => 1
  }
}

Thanks for taking the time, @Christian_Dahlqvist.

I am creating a custom Logstash filter, and I have managed to build a simple filter that adds a new column/field, "isEligible", in Elasticsearch.
All I want is to access the "age" field inside my custom filter so I can make a decision there.
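In a custom filter plugin (a Ruby class inheriting from LogStash::Filters::Base), each record arrives as an event object, and fields parsed by the csv filter are read with `event.get` and written with `event.set` (very old Logstash versions use `event["age"]` instead). The sketch below illustrates only the decision logic: the `StubEvent` class is a hypothetical stand-in for the real Logstash event so the example is self-contained, and the threshold of 18 is an assumed example, not anything from this thread.

```ruby
# Hypothetical stand-in for the Logstash event API (get/set).
# In a real plugin the event is supplied by Logstash itself and the
# filter class inherits from LogStash::Filters::Base.
class StubEvent
  def initialize(fields)
    @fields = fields
  end

  def get(name)
    @fields[name]
  end

  def set(name, value)
    @fields[name] = value
  end
end

# Core of the custom filter's filter(event) method: read the "age"
# field produced by the csv filter and derive an "isEligible" field.
def filter(event)
  age = event.get("age").to_i   # csv fields arrive as strings
  event.set("isEligible", age >= 18)  # 18 is an assumed example threshold
end

event = StubEvent.new({ "age" => "25" })
filter(event)
puts event.get("isEligible")
```

In the real plugin you would put the same two lines inside the `filter(event)` method of your filter class and finish with `filter_matched(event)` so downstream actions run.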

Thanks again.

When you are developing a configuration, use the stdout output plugin with the rubydebug codec to see the resulting events without having to go to Elasticsearch. Based on your csv filter you should have an 'age' field, which you can use in other filters and conditionals as described here.
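For instance, this could be sketched entirely in the pipeline configuration, without a custom plugin (a hedged example: the mutate convert step, the threshold of 18, and the field values are assumptions for illustration, not part of the original thread):

```
filter {
  csv {
    columns => ["id", "name", "age", "country"]
  }
  # csv produces string fields; convert "age" so the comparison is numeric
  mutate {
    convert => { "age" => "integer" }
  }
  # Use the parsed field in a conditional (18 is an example threshold)
  if [age] >= 18 {
    mutate { add_field => { "isEligible" => "true" } }
  } else {
    mutate { add_field => { "isEligible" => "false" } }
  }
}

output {
  # Inspect events on the console instead of sending them to Elasticsearch
  stdout { codec => rubydebug }
}
```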