Hi friends, I have a question about filters.
I have a conf file that works perfectly for sending active directory logs to elasticsearch.
In my log, I have a field named "event_data.TargetUserName" and it contains a registration number.
In my company, all users have a registration number.
Here is a sample of my CSV file:
sAMAccountName;displayName
C583;jane Doe
C090;John Doe
C587;JMichael Jackson
and here is my Logstash conf file (I'm using Logstash 2.4 and Elasticsearch 5.2.2):
input {
  kafka {
    zk_connect => "192.168.18.15:2181"
    group_id => "logstash-application"
    topic_id => "ActiveDirectory-Application-Logs"
    reset_beginning => "false"
    consumer_threads => 1
    codec => json {}
  }
}
output {
  elasticsearch {
    hosts => ["192.168.18.15:9200"]
    index => "logstash-application-%{+YYYY.MM.dd}"
  }
}
I don't know which filter to use to match the log field "event_data.TargetUserName" against the "sAMAccountName" column of my CSV file, and how to add a new field named "userName" containing the matching displayName.
I tried this, but it had no effect:
filter {
  if [event_data.TargetUserName] == "*" {
    csv {
      source => "/etc/logstash/mutate/ExportADLDS.csv"
      columns => ["sAMAccountName","displayName"]
      separator => ";"
    }
    add_tag => ["userName"]
    source => "[event_data][TargetUserName]"
    target => "userName"
    add_field => ["{[sAMAccountName]}", "%{[displayName]}"]
  }
}
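From what I've read, the csv filter only parses a field of the event itself, not an external file, so maybe the translate filter is closer to what I need: it looks up a field's value in a dictionary file and writes the result into another field. Below is a sketch of what I think the filter block should look like; the nested field syntax, the fallback value, and the .yml dictionary path are my own guesses, and I'm not sure my semicolon-separated CSV with a header row can be used directly, so I show it converted to a YAML key/value dictionary:

```
filter {
  if [event_data][TargetUserName] {
    translate {
      # field whose value (the registration number) is looked up
      field => "[event_data][TargetUserName]"
      # new field that receives the matching displayName
      destination => "userName"
      # dictionary file in key/value form (converted from my CSV):
      #   "C583": "jane Doe"
      #   "C090": "John Doe"
      #   "C587": "JMichael Jackson"
      dictionary_path => "/etc/logstash/mutate/ExportADLDS.yml"
      # value to use when no registration number matches (my choice)
      fallback => "unknown"
    }
  }
}
```

Would something like this work, or is there a way to keep the original semicolon-separated CSV?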
Many thanks for your help,
Fayce