Logstash reading directories recursively

Hi everyone,

in my environment I'm trying to read log files that are stored on the file system like this:

/home/andrea/mylogs/[subdir1....subdirN]/*.gz
/home/andrea/mylogs/[subdir1...subdirM]/*.gz
/home/andrea/mylogs/[subdir1...subdirP]/*.gz

With N, M, and P possibly different. Inside this structure are all the log files I need to read.
So basically I have this tree, and I need to read all the .gz log files with Logstash to import them into my indices.
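
For what it's worth, here's a quick way to double-check from the shell that a recursive glob actually matches those files (bash with globstar enabled; the two counts should agree):

shopt -s globstar
ls /home/andrea/mylogs/**/*.gz | wc -l

# Same check without globstar:
find /home/andrea/mylogs -type f -name '*.gz' | wc -l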
After searching the web and the Elastic community, I tried setting up the file input plugin for Logstash like this:

input {
  file {
    path => "/home/andrea/mylogs/**/*.gz"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    mode => "read"
  }
}
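
For completeness, this is the kind of full test pipeline I would expect to work, with a stdout output just for debugging. If I read the file input docs correctly, in read mode the plugin decompresses .gz files by itself and the default file_completed_action is "delete", so I set it to "log" to keep the source files (the log path here is just an example):

input {
  file {
    path => "/home/andrea/mylogs/**/*.gz"
    mode => "read"
    sincedb_path => "/dev/null"
    # Keep the source files instead of deleting them after reading
    file_completed_action => "log"
    file_completed_log_path => "/tmp/logstash_completed.log"
  }
}

output {
  # Just to verify events come through; swap in the elasticsearch output later
  stdout { codec => rubydebug }
}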

But it doesn't seem to find any files in those directories: the pipeline comes up and then stops, as if nothing matched:

[INFO ] 2022-04-12 13:26:14.310 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2022-04-12 13:26:19.445 [LogStash::Runner] runner - Logstash shut down.
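
In case it's useful to anyone debugging the same thing, Logstash can be launched with debug logging to see which paths the file input discovers (the config file path here is just an example):

bin/logstash -f /path/to/mylogs.conf --log.level=debug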

My questions are:

  1. Do I have to gunzip the log files to make them readable by Logstash? I've seen people importing log files in .gz format as well.
  2. How can I specify the path so that Logstash reads recursively through the directories I specified? (There are quite a lot of them...)

Any hint would be appreciated. Thank you for your help, and I hope you have a very nice Easter 🙂

Andrea

Edit: I just moved all the logs up into a single directory, running the following from the "root" folder
/home/andrea/mylogs/

find ./ -type f -print0 | xargs -0 mv -t .
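
(A caveat for anyone copying this: if two subdirectories contain files with the same name, that mv will silently overwrite them. A safer variant, assuming GNU coreutils, skips existing targets and leaves files already at the top level alone:)

find . -mindepth 2 -type f -name '*.gz' -print0 | xargs -0 mv -n -t .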

And then I started Logstash. That was actually the simpler solution 🙂
