Hi all,
I am running multiple pipelines on a single Logstash 5.x instance, and all of their logs accumulate in one log file. That part works fine.
There are scenarios where a pipeline does not deliver records from input to output, for example because of bad data in the CSV or the email server being down. For each data source, some records are missing from the output. How can I tell from the log file which conf file the missing records belong to?
In the example below, my email output has a wrong password.
1. How can I identify which pipeline this happened in, given that everything goes to the same log file?
In some cases my output is http, and there either the URL or the syntax is wrong. In those cases as well, how do I identify which pipeline caused the issue?
2. I also want to identify which records are missing from the output. For example, the input reads the data from the source, but when it reaches the output an error is logged. How can I tell which source record a particular log message refers to?
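For context, the http output in those pipelines looks roughly like this (the URL and method here are placeholders, not my real values):

```
output {
  http {
    url => "https://example.com/endpoint"   # placeholder URL
    http_method => "post"
    format => "json"
  }
}
```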
My example conf file is this:
input {
  file {
    path => ".....csv"
    sincedb_path => "...xyz.txt"
    start_position => "beginning"
    type => "excel"
  }
}

filter {
  csv {
    columns => ["date1","value1","cumvalue"]
    separator => ","
  }
}

output {
  email {
    to => "abc"
    body => "Testing the Email Body"
    address => "smtp.xyz.com"
    port => xxx
    username => "abc@xyz.com"
    password => "welcome123"
    use_tls => true
  }
}
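One idea I am considering for question 2 is writing every event that reaches the output stage to a file alongside the email output, so it can be diffed against the source CSV (a sketch only; the path and field format are placeholders):

```
output {
  email {
    # ... email settings as above ...
  }
  file {
    path => "/tmp/csv_email_out.log"   # placeholder path
    codec => line { format => "%{date1},%{value1},%{cumvalue}" }
  }
}
```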
I have tried the following:
- Adding a new field with a mutate filter to capture the error. But for an output failure, this event is never created.
- Adding "tags" in the input. Here, too, the failing message does not create an event in the output (which is correct), but the reason does not show up in the log file.
How should I trace the two issues above? Any suggestions?
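For reference, minimal sketches of the two attempts above (the field name, tag, and value are placeholders I chose, not anything prescribed):

```
# Attempt 1: mutate filter adding an identifying field
filter {
  mutate {
    add_field => { "pipeline_id" => "csv_email" }   # placeholder field/value
  }
}

# Attempt 2: tags on the input
input {
  file {
    path => ".....csv"
    tags => ["csv_email"]   # placeholder tag
  }
}
```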