Hi,
I've got a small issue with updating an index from a CSV. The PowerShell script I've written converts an XML file into a CSV: it copies the XML file to local storage, converts it, and writes the new CSV to a separate folder. I then use the file input in Logstash to pull in the CSV, mutate it, and push it to its index in Elasticsearch. Everything works as expected except for one small issue: when the scheduled PowerShell script runs again, it overwrites the CSV with the latest copy, and Logstash doesn't see this as a change, so the index isn't updated. I'm not sure how to get around this. Deleting the original CSV before writing the new one doesn't seem to make sense, as the replacement would have the same file name. Config file shown below.
input {
  file {
    type => "audit"
    path => "C:/Target_Directory/*.csv"
    start_position => "beginning"
    sincedb_path => "NUL"    # Windows equivalent of /dev/null
  }
}
filter {
  if [type] == "audit" {
    csv {
      separator => "|"
      columns => [ "Provider","EventID","EventName","Version","Source","Level","Opcode","Keywords","Result","TimeCreated","Correlation","Channel","Computer","ComputerUUID","Security","ProviderGuid","SubjectIP","SubjectUnix","SubjectUserSid","SubjectUserIsLocal","SubjectDomainName","SubjectUserName","ObjectServer","ObjectType","HandleID","ObjectName","AccessList","AccessMask","DesiredAccess","Attributes","OldDirHandle","NewDirHandle","OldPath","NewPath","InformationSet" ]
    }
    # Drop the header line of the import
    if [Computer] == "Computer" {
      drop { }
    }
    # Drop successful audits (we only care about failures)
    if [Result] == "Audit Success" {
      drop { }
    }
    # Drop any entries for PCs (we only care about usernames); machine account names end in "$"
    if [SubjectUserName] =~ /\$$/ {
      drop { }
    }
    date {
      match => [ "TimeCreated", "dd/MM/yyyy HH:mm:ss" ]
      target => "@timestamp"
      timezone => "Europe/London"
    }
    mutate {
      remove_field => [ "Provider","EventID","Version","Source","Level","Opcode","Keywords","Correlation","Channel","ComputerUUID","Security","ProviderGuid","SubjectUnix","SubjectUserSid","SubjectUserIsLocal","ObjectServer","HandleID","AccessList","AccessMask","DesiredAccess","Attributes","OldDirHandle","NewDirHandle","OldPath","NewPath","InformationSet","message" ]
      gsub => [ "Result", "Audit Success", "Success",
                "Result", "Audit Failure", "Fail" ]
      split => { "Computer" => "/" }
      add_field => { "SVM_Name" => "%{[Computer][1]}" }
    }
  }
}
output {
  if [type] == "audit" {
    elasticsearch {
      index => "audit-%{+YYYY.MM.dd}"
      hosts => ["elk.server.com:9200"]
      #user => "elastic"
      #password => "changeme"
    }
  }
}
Any suggestions as to why it fails to see the change? I'm assuming it has something to do with the file being replaced rather than appended to, because other CSV imports I've set up in a similar fashion work fine. The only difference is that on the others the CSV gets appended to, not replaced.
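The only workaround I've come up with so far is switching the input to the file plugin's read mode, so the whole file is consumed each time rather than tailed. Something along these lines is what I had in mind (untested; the mode and file_completed_action options are from the file input docs, and the completed-log path is just a placeholder):

input {
  file {
    type => "audit"
    path => "C:/Target_Directory/*.csv"
    mode => "read"                      # consume each file in full instead of tailing it
    file_completed_action => "log"      # keep the csv after it has been read
    file_completed_log_path => "C:/Target_Directory/completed.log"
    sincedb_path => "NUL"
  }
}

I don't know whether read mode would actually pick up an in-place overwrite of the same file name though, so any pointers would be appreciated.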
Thanks.