RabbitMQ output not sending any events

Hi,
we are trying to ship some IIS logs to Elasticsearch using Logstash, with a RabbitMQ instance acting as a buffer, something like (IIS server [IIS W3C logs] -> [Logstash]) -> [RabbitMQ] -> [Logstash] -> [Elasticsearch]. (It might be better to use Filebeat on the IIS server and a separate, centralized Logstash once we have more servers, but right now we have only one.)
The thing is, while Logstash does connect to RabbitMQ, and I can see the connection in the RabbitMQ management UI, it doesn't publish any events at all.

The config is:

## We have IIS configured to use a single log file for all sites
#   because logstash can't handle parsing files in different
#   directories if they have the same name.
#
input {  
  file {
    type => "iis-w3c"
    path => "D:/LogFiles/W3SVC*/*.log"
  }

}
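(One thing we may also need to rule out while testing: the file input remembers its read position in a sincedb, so after switching outputs Logstash will not re-read lines it already shipped to Elasticsearch. A sketch that forces a full re-read, assuming Logstash 5.x option names:)

```
input {
  file {
    type => "iis-w3c"
    path => "D:/LogFiles/W3SVC*/*.log"
    start_position => "beginning"  # read existing content, not just newly appended lines
    sincedb_path => "NUL"          # Windows: discard the remembered read position while testing
  }
}
```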

filter {  
  ## Ignore the comments that IIS will add to the start of the W3C logs
  #
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    ## Very helpful site for building these statements:
    #   http://grokdebug.herokuapp.com/
    #
    # This is configured to parse out every field of IIS's W3C format when
    #   every field is included in the logs
    #
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} \<%{WORD:serverName}\> %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
  }

  ## Set the Event Timestamp from the log
  #
  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UTC"
  }
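(For reference, the Joda pattern `YYYY-MM-dd HH:mm:ss` matches timestamps like `2017-09-22 16:51:07`. A quick Ruby round-trip using the equivalent `strptime` directives, just as a sanity check of the format, not the date filter itself:)

```ruby
require 'time'

# strptime equivalent of the Joda pattern "YYYY-MM-dd HH:mm:ss"
ts = "2017-09-22 16:51:07"                  # sample IIS W3C timestamp
t = Time.strptime(ts, "%Y-%m-%d %H:%M:%S")
puts t.strftime("%Y-%m-%d %H:%M:%S")        # round-trips to the same string
```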

  ## If the log record has a value for 'bytesSent', then add a new field
  #   to the event that converts it to kilobytes
  #
  if [bytesSent] {
    ruby {
      code => "event.set('kilobytesSent',event.get('bytesSent').to_i / 1024.0)"
    }
  }


  ## Do the same conversion for the bytes received value
  #
  if [bytesReceived] {
    ruby {
      code => "event.set('kilobytesReceived',event.get('bytesReceived').to_i / 1024.0)"
    }
  }
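(The two ruby filters just divide the string field by 1024.0; as a standalone sketch of that arithmetic, with a made-up sample value:)

```ruby
# Same conversion the ruby filters perform, outside Logstash.
# IIS writes numeric fields as strings, hence the .to_i.
bytes_sent = "204800"                      # hypothetical sample value
kilobytes_sent = bytes_sent.to_i / 1024.0  # the .0 keeps the division in floating point
puts kilobytes_sent                        # prints 200.0
```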

  ## Perform some mutations on the records to prep them for Elastic
  #
  mutate {
    ## Convert some fields from strings to integers
    #
    convert => ["bytesSent", "integer"]
    convert => ["bytesReceived", "integer"]
    convert => ["timetaken", "integer"]

    ## Create a new field for the reverse DNS lookup below
    #
    add_field => { "clientHostname" => "%{clientIP}" }

    ## Finally remove the original log_timestamp field since the event will
    #   have the proper date on it
    #
    remove_field => [ "log_timestamp"]
  }


  ## Do a reverse lookup on the client IP to get their hostname.
  #
  dns {
    ## Now that we've copied the clientIP into a new field we can
    #   simply replace it here using a reverse lookup
    #
    action => "replace"
    reverse => ["clientHostname"]
  }

  ## Parse out the user agent
  #
  useragent {
    source => "userAgent"
    prefix => "browser"
  }

}

## Output the records to RabbitMQ (the direct Elasticsearch output is
#   left commented out).
#
output {  
  #elasticsearch {
    #hosts => ["elastichost"]
    #
    ## Log records into month-based indexes
    #
    #index => "%{type}-%{+YYYY.MM}"
  #}
  rabbitmq {
    exchange => "exchange.iis.w3c"
    key => "common"
    exchange_type => "direct"
    durable => true
    host => "rabbithost"
    user => "user"
    password => "pass"
    port => 5672
  }
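(Since the exchange is direct, RabbitMQ silently drops any published message whose routing key has no bound queue, so "connected but nothing visible" could just mean nothing is bound to exchange.iis.w3c with key common. A quick check from the broker host, assuming rabbitmqctl is available:)

```
# look for a binding exchange.iis.w3c -> <some queue> with routing key "common"
rabbitmqctl list_bindings source_name destination_name routing_key

# once a queue is bound, messages should start accumulating
rabbitmqctl list_queues name messages
```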
  
  ## stdout included just for testing
  #
  #stdout {codec => rubydebug}
}

Inside the Logstash log files there is no error, but there are no lines related to the output plugin after the connection events:

[2017-09-22T16:51:07,459][DEBUG][logstash.outputs.rabbitmq] Connecting to RabbitMQ. Settings: {:vhost=>"/", :hosts=>["rabbithost"], :port=>5672, :user=>"user", :automatic_recovery=>true, :pass=>"pass", :timeout=>0, :heartbeat=>0}
[2017-09-22T16:51:07,587][INFO ][logstash.outputs.rabbitmq] Connected to RabbitMQ at 
[2017-09-22T16:51:07,587][DEBUG][logstash.outputs.rabbitmq] Declaring an exchange {:name=>"exchange.icinga.main", :type=>"direct", :durable=>true}
[2017-09-22T16:51:07,680][DEBUG][logstash.outputs.rabbitmq] Exchange declared

Previously we sent events directly to Elasticsearch without any problem, so the input and filter sections should be fine.

Any ideas?
