Archive index to gzip - can't decompress


(boaz) #1

Hey all.
I'm trying to archive logs to gzip every hour - the archiving works and a gzip file is saved each hour.
But when I try to decompress it (via Node.js / Linux / gunzip / Cygwin) I get an error:
"Gzip outputting invalid compressed data"
"gzip: logs-2018.12.10.07.log.tar.gz: invalid compressed data--format violated"
I tried opening the gzip via Cygwin (Linux) and saw JSON in an invalid format.
My Logstash config is:
input {
  tcp {
    port => 5556
  }
  udp {
    port => 5566
  }
}
filter {
  csv {
    separator => ","
    columns => [
      "os","host_name","client_time","full_server_time","process_id","process_name","process_path","application_name","protocol",
      "status","source_port","destination_port","direction","file_path","x_cast","state",
      "source_ip","destination_ip","sequance_number","sub_sequance_number","user_name","mog_counter",
      "destination_path","reason","image_path","image_name","parent_path","parent_name","chain_array"
    ]
  }
  mutate { convert => ["process_id","integer"] }
  mutate { convert => ["source_port","integer"] }
  mutate { convert => ["destination_port","integer"] }
  mutate { convert => ["sequance_number","integer"] }
  mutate { convert => ["mog_counter","integer"] }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logs-%{+YYYY.MM.dd}"
    template => "C:\Cyber20\loginsert\application\logstash_config\index_template.json"
    template_overwrite => "true"
  }
  file {
    path => "c:/cyber20/logarchiver/logs-%{+YYYY.MM.dd.HH}.gz"
    gzip => true
  }
}

I also tried to read the gzip back in with a file input and the gzip_lines codec - nothing happened; Logstash read the config but I can't see the logs.
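For comparison (a sanity check, not something specific to Logstash): a gzip file made of several concatenated members - which is what an appending writer normally produces - is valid per the gzip format, and `gzip -dc` decompresses all members in sequence. So appending by itself shouldn't cause the "format violated" error:

```shell
# Two separate gzip members appended into one file.
printf 'line one\n' | gzip -c >  combined.gz
printf 'line two\n' | gzip -c >> combined.gz

# Both members decompress in sequence: "line one" then "line two".
gzip -dc combined.gz
```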
logstash - 6.4.2
ES - 6.4.2
kibana - 6.4.2

please help :slight_smile:
thanks.


(boaz) #2

Bump
Tried with version 6.5.x as well - same issue.
Tried adding codec => json_lines to the file output - same issue.


(boaz) #3

Bump


(boaz) #4

No one? :frowning:


(boaz) #5

Bump


(boaz) #6

bump


(boaz) #7

bump


(boaz) #8

bump


#9

This is a known issue: https://github.com/logstash-plugins/logstash-output-file/issues/61
Try to get it fixed from there, I gave up a long time ago :slight_smile:

EDIT:

I tried opening the gzip via Cygwin (Linux) and saw JSON in an invalid format.

My data was very much intact, but it was missing the gzip footer (or something like that). Are you sure the JSON is invalid if you zcat it?
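The missing-footer symptom is easy to reproduce outside Logstash: strip the 8-byte gzip footer (CRC32 + uncompressed size) from any gzip file, and gzip will report an error even though the payload itself decompresses fine (the filenames here are just for the demo):

```shell
# Build a normal gzip file.
printf 'hello logstash\n' | gzip -c > intact.gz

# Strip the 8-byte footer (CRC32 + ISIZE) to mimic an archive whose
# writer never closed the gzip stream properly.
size=$(wc -c < intact.gz)
head -c "$((size - 8))" intact.gz > truncated.gz

# gzip warns on stderr ("unexpected end of file") and exits non-zero,
# but the decompressed data still comes out on stdout.
gzip -dc truncated.gz 2>/dev/null || true
```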


(boaz) #10

Tried with zcat, and with zlib via Node.js - same issue:
"outputting invalid compressed data"


#11

Yes, but that still doesn't indicate that the JSON data inside the archive is invalid. It just means the compressed package itself is invalid, as in my case.
Anyway, you'll have to take it to the developers or fix it yourself. There doesn't seem to be anything wrong with your config.


(system) closed #12

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.