Logstash un-gzip array log configuration


(Александр Кобрин) #1

Hey everyone! I have a Logstash config that forwards logs from RabbitMQ to Elasticsearch. Something like this:

input {
    rabbitmq {
        ...
    }
}

filter {
    if [type] == "rabbitmq" {
        json {
            source => "message"
            target => "message"
        }
    }
}

output {
  elasticsearch {
    hosts => ["${ES_HOST}"]
    user => "${ES_USERNAME}"
    password => "${ES_PASSWORD}"
    sniffing => false
    index => "kit_events-%{[message][elasticsearch][index]}"
  }
}

We were forced to compress the logs on the fly because they were using too much traffic: the logs are now batched into an array and gzipped before being sent. What is the correct way to configure un-gzipping and splitting the array back into separate events?

I did some research and found that there is a gzip_lines codec plugin, and that a Ruby filter could probably be used to parse the array, but I failed to implement it. Has anyone done something like this before?
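
What I am trying to end up with is roughly the filter chain below. This is only an untested sketch: it assumes the RabbitMQ message body is a Base64-encoded gzipped JSON array (drop the Base64 step if the raw gzip bytes arrive as-is), and as far as I can tell gzip_lines is a codec for gzipped line streams such as files, so per-message decompression would have to happen in a ruby filter instead.

filter {
    if [type] == "rabbitmq" {
        # gunzip the payload first (assumption: the producer Base64-encodes
        # the gzipped bytes before publishing)
        ruby {
            init => "require 'zlib'; require 'stringio'; require 'base64'"
            code => "
                begin
                    bytes = Base64.decode64(event.get('message'))
                    event.set('message', Zlib::GzipReader.new(StringIO.new(bytes)).read)
                rescue => e
                    event.tag('_gunzip_failure')
                end
            "
        }
        # parse the decompressed JSON array...
        json {
            source => "message"
            target => "message"
        }
        # ...and emit one event per array element
        split {
            field => "message"
        }
    }
}

If that works, each split event should still carry [message][elasticsearch][index], so the elasticsearch output above stays unchanged.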


(Lewis Barclay) #2

I'm not really sure I fully understand, but where are they zipped? At the source?


(Александр Кобрин) #3

Yes, a Python script sends the data to RabbitMQ, and the data is gzipped before it is sent.
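
On the producer side it looks roughly like this (simplified sketch: the queue name and field layout are made up for illustration, and I am assuming pika as the RabbitMQ client and Base64 encoding of the compressed payload):

import base64
import gzip
import json

import pika  # assumption: pika is the RabbitMQ client library

# hypothetical batch of log objects, each carrying its target index suffix
events = [
    {"elasticsearch": {"index": "2017.01.01"}, "text": "first log entry"},
    {"elasticsearch": {"index": "2017.01.01"}, "text": "second log entry"},
]

# gzip the whole array, then Base64-encode it so the message body stays text-safe
body = base64.b64encode(gzip.compress(json.dumps(events).encode("utf-8")))

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="kit_events")  # hypothetical queue name
channel.basic_publish(exchange="", routing_key="kit_events", body=body)
connection.close()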