I have deployed three Logstash configuration files on my server, each processing a different kind of log file (.xml, .log and .txt). I am using three Log Courier instances (version 1.6) to ship the log files and Logstash (version 1.5.4) to process them. I receive tons of log files daily; Log Courier picks up all of the files from their locations, but Logstash skips many events.
I am running this setup on Red Hat Linux (kernel 3.10.0-229.el7.x86_64) with JDK 1.7. Am I missing anything that would sort out this issue?
My Logstash configuration:
input {
  courier {
    port => xxxx
    transport => "tcp"
  }
}
filter {
  .....
}
output {
  elasticsearch {
    index => "logstash-server1-%{+YYYY.MM.dd}"
    host => "XXXXXX"
    cluster => "server_cluster"
    protocol => "http"
    document_id => "%{fingerprint}"
  }
}
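If the events are being dropped under load, one thing I am considering is tuning the batching and worker settings of the elasticsearch output. A sketch of what this could look like; the option names are from the 1.5.x elasticsearch output plugin as I understand it, and the values are guesses I have not tested:

```
output {
  elasticsearch {
    index => "logstash-server1-%{+YYYY.MM.dd}"
    host => "XXXXXX"
    cluster => "server_cluster"
    protocol => "http"
    document_id => "%{fingerprint}"
    # Assumed tuning knobs -- values untested on my setup:
    flush_size => 500       # events per bulk request
    idle_flush_time => 1    # seconds before flushing a partial batch
    workers => 2            # parallel output worker threads
  }
}
```

Would these settings help, or is the bottleneck likely elsewhere?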
My Log Courier configuration:
{
  "general": {
    "log level": "debug",
    "persist directory": "<.logcourier file location>"
  },
  "network": {
    "servers": [ "127.0.0.1:xxxx" ],
    "transport": "tcp"
  },
  "files": [
    {
      "paths": [
        "/Mylogdata/error/[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]/vg67-*.xml"
      ],
      "fields": { "type": "sss_log" },
      "codec": { "name": "multiline", "pattern": "<regular expression>", "negate": true, "what": "next" },
      "dead time": "1s"
    }
  ]
}
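One thing that stands out to me is the "dead time" of "1s"; my understanding is that the default is much longer (hours), and with such a short value the harvester may close a file during a brief pause in writes. A variant of the file entry I could test, keeping everything else the same (the 24h value is only an assumption based on the common shipper default, not something I have verified):

```json
{
  "paths": [
    "/Mylogdata/error/[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]/vg67-*.xml"
  ],
  "fields": { "type": "sss_log" },
  "codec": { "name": "multiline", "pattern": "<regular expression>", "negate": true, "what": "next" },
  "dead time": "24h"
}
```

Could the aggressive dead time be the reason events go missing?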
Do I need to add any additional property to overcome this issue? I am using the same configuration for all three pipelines (both the Logstash and the Log Courier files).