Issue with logstash - rabbitmq input plugin


(Azhar) #1

The aim is to have a Logstash server consume messages from a RabbitMQ queue and flush them to an Elasticsearch database.

My logstash.conf file content is as follows:

input {
  rabbitmq {
    codec => "line"
    host => "x.x.x.x"
    password => "xxxx"
    user => "xxxx"
    connection_timeout => 1000
    heartbeat => 30
    subscription_retry_interval_seconds => 30
    queue => "xxxx"
  }
}

filter {
  grok {
    match => [ "message", "%{IP:client_ip} (?<timestamp>[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}) %{WORD:severity} (?<source>.*\:[0-9]*) (?<whitespaces>[ \t]*) (?<message>.*)" ]
  }
  mutate {
    strip => [ "whitespaces" ]
  }
}

output {
  elasticsearch {
    hosts => ["x.x.x.x:9200", "x.x.x.x:9200"]
    index => "test_logging"
    action => "index"
  }
}
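To illustrate what the grok filter above expects, here is a rough Python equivalent of that pattern (the sample log line is hypothetical, invented to match the format; `%{IP}` and `%{WORD}` are approximated with simple character classes). Note that the pattern also defines a named capture called `message`, which overwrites the original `message` field:

```python
import re

# Hypothetical sample line shaped like what the grok pattern expects:
# IP, "YYYY-MM-DD HH:MM:SS" timestamp, severity word, "source:line",
# a run of whitespace, then the message body.
LINE = "10.0.0.1 2016-05-01 12:34:56 ERROR app.py:42   Something failed"

# Approximate Python translation of the grok expression in the filter.
PATTERN = re.compile(
    r"(?P<client_ip>\d{1,3}(?:\.\d{1,3}){3}) "    # %{IP:client_ip}, simplified
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<severity>\w+) "                          # %{WORD:severity}
    r"(?P<source>.*:[0-9]*) "                      # e.g. "app.py:42"
    r"(?P<whitespaces>[ \t]*) "                    # stripped later by mutate
    r"(?P<message>.*)"                             # overwrites the message field
)

m = PATTERN.match(LINE)
if m:
    print(m.group("client_ip"), m.group("severity"), m.group("message"))
```

If a consumed message does not match this shape, grok tags the event with `_grokparsefailure` but the event is still indexed, so a grok mismatch alone would not explain documents never appearing.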

I'm using the pika Python driver to push messages to the queue, and I can see from the RabbitMQ management UI that the messages are immediately drained from the queue.
But I don't see any documents being created in Elasticsearch. I'd highly appreciate any help or leads. Thanks in advance!
To debug a little more, I changed the Logstash output to print the queue messages to the console, but I couldn't capture any messages there either.
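For anyone trying the same debugging step, a minimal console output block (assuming the standard stdout output plugin with the rubydebug codec) looks like this:

```
output {
  stdout {
    # Pretty-prints each event, including all parsed fields and tags
    codec => rubydebug
  }
}
```

Running Logstash in the foreground with this output shows every event as it passes through the pipeline, which makes it easy to see whether events are arriving at all and whether the grok filter matched.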


(Magnus Bäck) #2

So if you shut down Logstash the queue fills up without getting drained?


(Azhar) #3

Hi @magnusbaeck, thanks for responding. I figured out the problem: the codec in the input plugin was supposed to be "plain". The Logstash server was pulling items from the RabbitMQ queue but was unable to parse them. Starting Logstash manually in verbose and debug mode and redirecting the output to stdout helped me figure out that it was a parsing issue.
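For reference, the fix is a one-line change to the input block from the original post (my understanding is that the line codec buffers input until it sees a newline, so AMQP payloads without trailing newlines may never be emitted as events, whereas the plain codec treats each message payload as one event):

```
input {
  rabbitmq {
    codec => "plain"   # was "line"
    host => "x.x.x.x"
    password => "xxxx"
    user => "xxxx"
    connection_timeout => 1000
    heartbeat => 30
    subscription_retry_interval_seconds => 30
    queue => "xxxx"
  }
}
```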


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.