Unable to read .gz files in logstash


(Clash Of Clanss) #1

Hi,

I am trying to read .gz files with the Logstash file input plugin, but it fails. When I try to read those log files it throws an error message like this:

A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::File path=>["/test.txt"], start_position=>"beginning", sincedb_path=>"/dev/null", codec=><LogStash::Codecs::GzipLines charset=>"UTF-8">, stat_interval=>1, discover_interval=>15, sincedb_write_interval=>15, delimiter=>"\n">

Error: Object: application.log.2018-05-14-01.eradar-data-service-prod-1e-5af1c225.us-east-1.amazon.com.gz is not a legal argument to this wrapper, cause it doesn't respond to "read". {:level=>:error}

LogStash.conf file:
    file {
        path => "/test.txt"              # this file contains a list of gzipped file names
        start_position => "beginning"
        sincedb_path => "/dev/null"
        codec => "gzip_lines"
    }

I also tried path => "/logfile.gz" directly, but neither works; both throw the same error message.
How can I solve this problem? Please share your solutions. I have referred to all the similar questions, but none of their solutions fix my issue.

Thanks,


(Ry Biesemeyer) #2

The path directive enables you to specify a glob-type path to files that will be read (e.g., application.log.*.gz).
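As a sketch of what that could look like in the config (the directory and glob below are hypothetical; on older Logstash versions the gzip_lines codec is still needed to decompress each matched file):

```
input {
    file {
        path => "/var/log/app/application.log.*.gz"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        codec => "gzip_lines"
    }
}
```

Note that path points at the .gz files themselves, not at a text file listing their names.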


(Clash Of Clanss) #3

Sorry, I don't understand your solution. Can you explain in some more detail?
Thanks.


(Ry Biesemeyer) #4

When you specify a path that matches a file, Logstash will attempt to ingest that file. Since you are giving a path to a plain-text file, and it is trying to read it with gzip, it fails. You need to give it a glob-style path that matches the files you want to ingest.


(Clash Of Clanss) #5

I also tried that method, but it does not work; I get the same error message again.


(Guy Boertje) #6

If you use the latest file input, v4.1.3, you can read .gz files directly, without the GzipLines codec.
Here is a simple test config:

input {
    file {
        path => "/Users/guy/tmp/testing/logs/pan*.gz"
        sincedb_path => "/Users/guy/tmp/testing/gz-local-read-file-mode.sdb"
        discover_interval => 40
        stat_interval => 0.1
        mode => "read"
        file_completed_action => "log"
        file_completed_log_path => "/Users/guy/tmp/testing/gz-local-read-file-mode-completed.log"
        file_sort_by => "path"
    }
}

output {
  stdout { codec => dots }
}

(Clash Of Clanss) #7

My Logstash version - 2.1.0


(Christian Dahlqvist) #8

That is a very old version. I would recommend that you upgrade.


#9

Is there a recommended way to process around 100,000 files without having logstash die?


(Guy Boertje) #10

Not yet. I have some experimental code that "feeds" batches of discovered files to the processing stage, but it is not ready yet.

I hope that you are not tailing hundreds of thousands of files.


(Clash Of Clanss) #11

Hi,

I still can't read gzip files after updating the file input plugin.

Thanks,
Arun


(Guy Boertje) #12

The file input at v4.1.X has test file fixtures and tests that verify the code can read .gz files; these fixtures, tests, and the actual gzip-reading code are identical to the S3 input's gz-reading implementation.

You are experiencing a different outcome, perhaps because of some differences in your setup. You will need to uncover what these are before anyone can help.

If the LS logs show some errors or logging lines to do with filewatch.readmode.handlers.readzipfile, then post them here.

Have you tested a fresh file compressed with gzip /some/pathto/somefile.x? (Note that tar cvzf produces a gzipped tar archive, not a plain gzipped text file.)
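For reference, a plain gzip stream (as opposed to a tar archive compressed with gzip) can be produced and inspected like this; the file contents and paths are only examples:

```shell
# Write a small plain-text file (example path).
printf 'line one\nline two\n' > /tmp/somefile.x

# gzip produces the plain .gz stream that line-oriented gz readers expect;
# -k keeps the original file, -f overwrites any existing .gz.
gzip -kf /tmp/somefile.x

# Decompress to stdout to confirm the lines survive intact.
gunzip -c /tmp/somefile.x.gz
```

A file created with tar, by contrast, is a tar archive wrapped in gzip, so decompressing it yields tar's binary framing rather than the original lines.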


(Clash Of Clanss) #13

Hi,

I am not reading logs from S3, so is that okay?


(system) #14

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.