The aim is to have a Logstash server consume messages from a RabbitMQ queue and flush them into Elasticsearch.
My logstash.conf is as follows:
input {
  rabbitmq {
    codec => "line"
    host => "x.x.x.x"
    password => "xxxx"
    user => "xxxx"
    connection_timeout => 1000
    heartbeat => 30
    subscription_retry_interval_seconds => 30
    queue => "xxxx"
  }
}

filter {
  grok {
    match => [ "message", "%{IP:client_ip} (?<timestamp>[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}) %{WORD:severity} (?<source>.*\:[0-9]*) (?<whitespaces>[ \t]*) (?<message>.*)" ]
  }
  mutate {
    strip => [ "whitespaces" ]
  }
}

output {
  elasticsearch {
    hosts => ["x.x.x.x:9200", "x.x.x.x:9200"]
    index => "test_logging"
    action => "index"
  }
}
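For reference, the grok pattern above is meant to match log lines shaped roughly like the one below (the values are made up for illustration), splitting them into client_ip, timestamp, severity, source, and message fields:

10.1.1.1 2016-05-12 14:03:27 INFO some.module:42     something happened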
I'm pushing messages to the queue with the pika Python driver, and the RabbitMQ management UI shows the messages being drained from the queue immediately...
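For context, the producer side looks roughly like this; it is only a sketch, with the broker address, credentials, queue name, and message body as placeholders (the queue itself already exists on the broker):

import pika

# Placeholder connection details, matching the redacted values in logstash.conf.
credentials = pika.PlainCredentials("xxxx", "xxxx")
connection = pika.BlockingConnection(
    pika.ConnectionParameters(host="x.x.x.x", credentials=credentials)
)
channel = connection.channel()

# Publish one plain-text line per message, in the shape the grok filter expects.
# Using the default exchange, so the routing key is the queue name.
channel.basic_publish(
    exchange="",
    routing_key="xxxx",
    body="10.1.1.1 2016-05-12 14:03:27 INFO some.module:42     something happened",
)
connection.close()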
However, I don't see any documents being created in Elasticsearch. Any help or leads would be highly appreciated. Thanks in advance!
To debug a little further, I changed the Logstash output to print the queue messages to the console, but I couldn't capture any messages there either...
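For that test I swapped the elasticsearch output for the standard stdout plugin, roughly like this:

output {
  stdout {
    codec => rubydebug
  }
}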