Hey all.
I'm trying to archive logs to gzip every hour, and the archive file is being written every hour as expected.
But when I try to decompress it (via Node.js / Linux / gunzip / Cygwin) I get errors:
"Gzip outputting invalid compressed data"
"gzip: logs-2018.12.10.07.log.tar.gz: invalid compressed data--format violated"
When I opened the gzip in Cygwin (Linux) I saw JSON in an invalid format.
My Logstash config is:
input {
  tcp {
    port => 5556
  }
  udp {
    port => 5566
  }
}

filter {
  csv {
    separator => ","
    columns => [
      "os", "host_name", "client_time", "full_server_time", "process_id",
      "process_name", "process_path", "application_name", "protocol",
      "status", "source_port", "destination_port", "direction", "file_path",
      "x_cast", "state", "source_ip", "destination_ip", "sequance_number",
      "sub_sequance_number", "user_name", "mog_counter", "destination_path",
      "reason", "image_path", "image_name", "parent_path", "parent_name",
      "chain_array"
    ]
  }
  mutate { convert => ["process_id", "integer"] }
  mutate { convert => ["source_port", "integer"] }
  mutate { convert => ["destination_port", "integer"] }
  mutate { convert => ["sequance_number", "integer"] }
  mutate { convert => ["mog_counter", "integer"] }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logs-%{+YYYY.MM.dd}"
    template => "C:\Cyber20\loginsert\application\logstash_config\index_template.json"
    template_overwrite => true
  }
  file {
    path => "c:/cyber20/logarchiver/logs-%{+YYYY.MM.dd.HH}.gz"
    gzip => true
  }
}
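For context on the error: with gzip => true, the file output appends a new gzip member to the same file on each flush, and it keeps the current hour's file open while it is still writing. A file with an unterminated final member will fail gunzip with exactly this kind of "invalid compressed data" message. Also note the file is plain gzip, not a tar archive, despite the .tar.gz name in the error, so gunzip/zcat is the right tool, not tar. A minimal sketch of what a healthy multi-member archive looks like (sample.gz is just an illustration, not one of my files):

```shell
# Simulate what the file output produces: each flush appends another
# complete gzip member to the same file.
printf 'first batch\n'  | gzip >  sample.gz
printf 'second batch\n' | gzip >> sample.gz   # appended member, still valid gzip

gzip -t sample.gz && echo "archive OK"   # -t checks integrity without extracting
zcat sample.gz                           # prints both batches in order
```

If gzip -t fails only on the most recent hour's file, Logstash most likely still holds it open; testing an older, already-rotated hour should succeed.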
I also tried to read the gzip back with a file input and the gzip_lines codec - nothing happened: Logstash read the config, but I can't see any logs.
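For reading the archives back, one option to try (assuming the file input plugin bundled with 6.4.x supports it - mode => "read" was added in file input 4.1.0, and in read mode the plugin decompresses .gz files itself, so no gzip_lines codec is needed) is a sketch like:

input {
  file {
    # Example path only - point it at an already-rotated (closed) hourly archive.
    path => "c:/cyber20/logarchiver/logs-*.gz"
    mode => "read"          # read whole files once, instead of tailing them
    sincedb_path => "NUL"   # Windows; use /dev/null on Linux to not persist position
  }
}
output { stdout { codec => rubydebug } }

Reading the current hour's file while Logstash is still writing it would hit the same truncated-final-member problem as above, so only point this at closed archives.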
Logstash 6.4.2
Elasticsearch 6.4.2
Kibana 6.4.2
Please help.
Thanks.