Hi everyone,
in my environment I'm trying to read log files stored on the file system in a layout like this:
/home/andrea/mylogs/[subdir1...subdirN]/*.gz
/home/andrea/mylogs/[subdir1...subdirM]/*.gz
/home/andrea/mylogs/[subdir1...subdirP]/*.gz
N, M, and P may all be different, and this structure contains all the log files I need to read.
So basically I need Logstash to read every .gz log file in this tree and import it into my indices.
After searching the web and the Elastic community, I tried configuring the Logstash file input plugin like this:
input {
  file {
    path => "/home/andrea/mylogs/**/*.gz"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # don't persist read positions between runs
    mode => "read"
  }
}
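(I've left the output section out above; while testing I'm just printing events, more or less like this:)
output {
  stdout { codec => rubydebug }   # debug output only; the real pipeline will write to Elasticsearch
}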
But it doesn't seem to find any files in those directories: the pipeline starts up and then shuts down almost immediately, as if there were nothing to read:
[INFO ] 2022-04-12 13:26:14.310 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2022-04-12 13:26:19.445 [LogStash::Runner] runner - Logstash shut down.
My questions are:
- Do I have to gunzip the log files to make them readable by Logstash? I've seen people import log files in .gz format as well.
- How should I write the path option so that Logstash reads the directories I specified recursively? (There are quite a lot of them, so listing each one isn't practical.) See the sketch below for what I was expecting to work.
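For reference, this is roughly what I was expecting to work after reading the file input docs. The file_completed_* settings are my guess at what read mode needs, since I gather it deletes files after reading them by default, and the completed.log path is just a placeholder:
input {
  file {
    # '**' should glob recursively into every subdirectory level
    path => "/home/andrea/mylogs/**/*.gz"
    # read mode reads each file in full and, as I understand it, decompresses .gz files itself
    mode => "read"
    sincedb_path => "/dev/null"
    # my guess: log completed files instead of the default delete-after-read behaviour
    file_completed_action => "log"
    file_completed_log_path => "/home/andrea/mylogs/completed.log"  # placeholder path
  }
}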
Any hint would be appreciated. Thank you for your help, and I hope you have a very nice Easter!
Andrea