I have rolling file appenders in my Java code which create log files like application.log, application.log.1.gz, application.log.2.gz, and so on. I want to index all of these and make sure the messages from every log file are read.
Please find my logstash.conf below; it does not seem to be working. Kindly suggest a fix.
input {
  file {
    path => [
      "/path/to/application.log",
      "/path/to/application.log.1.gz"
    ]
    codec => gzip_lines {}
  }
}

filter {
  if [path] =~ /application\.log(?:\.\d+)?\.gz$/ {
    grok {
      match => { "message" => "Published symbol:%{NUMBER:Published_count:int}" }
    }
  }
}
You will need a file input with the default codec to read "/path/to/application.log", and a second input with a gzip_lines codec to read the rotated files.
However, unless you are working with a static backup of live logs, the normal use case would be to read just application.log. When that file is rotated and compressed, Logstash will already have read the data and does not need to re-read the gzipped file.
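For completeness, a minimal sketch of that two-input layout, using the placeholder paths from your post. The wildcard pattern and start_position are my assumptions, not tested config, and the gzip_lines codec may need to be installed with bin/logstash-plugin if it is not bundled with your Logstash version:

input {
  # live log: tailed continuously with the default plain codec
  file {
    path => ["/path/to/application.log"]
  }

  # already-rotated archives: read with the gzip_lines codec
  file {
    path => ["/path/to/application.log.*.gz"]   # assumed wildcard for the rotated files
    codec => gzip_lines {}
    start_position => "beginning"
  }
}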
Got it. I also need to apply a grok filter to specific log files from a list.
Example:
file {
  path => [
    "/var/app/log/process1.log",
    "/var/app/log/process2.log",
    "/var/app/log/process3.log"
  ]
  start_position => "beginning"
  discover_interval => 10
  codec => plain { }
}
However, the conditional in my filter never seems to match:
filter {
  if [path] == "/var/app/log/process1.log" {
    # code inside this block is never called
    grok {
      # logic resides here
    }
  }
}
Any suggestions?
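One thing worth checking, though this is an assumption on my part rather than something confirmed above: depending on your Logstash version and its ecs_compatibility setting, the file input may store the source path in [log][file][path] instead of the top-level [path] field, in which case the == comparison silently never matches. A quick way to see which field actually holds the path is to dump events to the console:

output {
  # print full events so you can see where the file path really lives
  stdout { codec => rubydebug }
}

If the path turns out to be in [log][file][path], the conditional would become if [log][file][path] == "/var/app/log/process1.log".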